DDR4 vs. DDR5, The Best Memory For Gamers? Core i9-12900K, 41 Game Benchmark

  • Published: 11 Sep 2024

Comments • 1.1K

  • @Bruh-gx1ey
    @Bruh-gx1ey 2 years ago +637

    41 games at 3 resolutions! You guys put so much effort into your content, much appreciated! Thanks Steve!

    • @_draxin_0514
      @_draxin_0514 2 years ago +12

      @Klip One To be fair, not that many people concurrently play Valorant. At least not in comparison to other games out there that have the same kind of hardcore sweatlord community members and gameplay.

    • @Neuroszima
      @Neuroszima 2 years ago +12

      As if anyone cared about valorant. More fancy CSGO at most

    • @yodaaki3652
      @yodaaki3652 2 years ago +3

      @Klip One Valorant runs on a toaster, it's not like we need DDR5 for that...
      Any current configuration is enough for valorant :)

    • @dazeen9591
      @dazeen9591 1 year ago

      should've run the benchmark at 480p and 720p

    • @oliversmith2129
      @oliversmith2129 1 year ago +1

      @@dazeen9591 240p

  • @bb5307
    @bb5307 2 years ago +905

    Most of the big gains seem to be in Ubisoft games. Lots of high-bandwidth RAM must be beneficial when you layer the game with three layers of DRM and spaghetti code.

    • @jadhsyjhduyhjy
      @jadhsyjhduyhjy 2 years ago +23

      haha you are right

    • @erikhendrickson59
      @erikhendrickson59 2 years ago +34

      LMAO. I'd love to see some comparisons here between pirated (read: cracked) and "fully-DRMed" builds. Honestly though, a lot of it is just real-time asset streaming benefiting IMMENSELY from sheer memory bandwidth over lower latency -- especially now with above-4G-decoding-capable graphics cards pulling from said memory. They can effectively move entire 4GB chunks in one second, which is massively beneficial for asset streaming in open-world games (and for preventing stutter while doing so).

    • @Hadgerz
      @Hadgerz 2 years ago +19

      Being an avid War Thunder player, and seeing the OVER 20% PERF INCREASE in this test, that spaghetti code comment seems very much legit.

    • @saricubra2867
      @saricubra2867 2 years ago +9

      "Ubisoft"
      You mean Buggysoft.

    • @Manakuski
      @Manakuski 2 years ago +1

      I would've imagined Call of Duty: Warzone to be a title where we'd see a difference.

  • @JarrodsTech
    @JarrodsTech 2 years ago +316

    Nice new set area!

    • @jmashcomptech
      @jmashcomptech 2 years ago +1

      Yeah! I'd like to see Tim do a review of that huge monitor behind Steve.

    • @argh6666
      @argh6666 2 years ago

      Stop simping for HWUnboxed man, jeez

    • @o0o_ghost
      @o0o_ghost 2 years ago

      @@argh6666 Bruuuh lmao

    • @bearnord4665
      @bearnord4665 2 years ago

      Ty!

    • @atanaszarkov9442
      @atanaszarkov9442 2 years ago

      @@evalangley3985 Lol, good to know!

  • @IMarsiii
    @IMarsiii 2 years ago +232

    Again, props to you for that amazing work and testing. No bulls***t, just crisp, clean numbers, polished and presented for us in nice diagrams.

  • @mroutcast8515
    @mroutcast8515 2 years ago +239

    41 game benchmark - still sane, Steve??

    • @LucariosAxe
      @LucariosAxe 2 years ago +6

      PoE reference?

    • @mroutcast8515
      @mroutcast8515 2 years ago +2

      @@LucariosAxe I guess, been playing it for 8 years - phrase must have stuck with me.

    • @Boodoo4You
      @Boodoo4You 2 years ago +6

      41 games at 3 resolutions = 123 configurations. Running each benchmark 3 times means he would've done at least 369 benchmark runs!

    • @marekk.3322
      @marekk.3322 2 years ago

      Spot the PoE player in the comments

    • @xlinnaeus
      @xlinnaeus 2 years ago

      11:38 why don’t you ask him yourself…?

  • @CESAR_H_ARIAS
    @CESAR_H_ARIAS 2 years ago +197

    When the RAM alone costs as much as a processor-and-motherboard combo bundle 😂😂💩 2021 has been amazing.

    • @xmaverickhunterkx
      @xmaverickhunterkx 2 years ago +17

      It's new tech. It's always especially expensive.

    • @FeintMotion
      @FeintMotion 2 years ago +5

      That isn't a supply thing, it's a "yields for a new thing" thing

    • @mza2001
      @mza2001 2 years ago +8

      1) It's the new-tech tax, just like early DDR4.
      2) The power architecture changed, and it's in the RAM itself, not the board.
      3) It's still maturing, and it's a generational leap for enthusiasts, since DDR4 is at its peak and can't handle some applications.
      4) Production is ramping up with lower latency and higher speeds.
      5) Prices will mature once AM5 and Z790 come out, with better-binned sticks for cheaper.
      6) The potential is only going to get crazier.
      I don't get you guys complaining 😑, it's new tech.

    • @takehirolol5962
      @takehirolol5962 2 years ago

      It was obvious since the start of time...

    • @sagerdood
      @sagerdood 2 years ago

      Buuuuuut ddr5 is cheap. Certainly not as much as a mobo

  • @dignes3446
    @dignes3446 2 years ago +30

    Hi, Hardware Unboxed. I really appreciate that you guys are including a few RTS games in your benchmarks (SC2, AoE IV). Please keep doing this for us RTS gamers. Our genre is not dead... it's Undead!

    • @noelchristie7669
      @noelchristie7669 1 year ago

      Just started playing AoE 2 and I'm having a great time, excited to try the newer games once I'm done with it!

  • @catsspat
    @catsspat 2 years ago +78

    1:51 470 USD for a 32GB kit. Wow, just wow.
    Granted, I never buy "gamer" memory sticks, but even the most expensive pair of memory sticks I ever bought (2 sticks of KSM32ED8/32ME = 64GB) was less than that, at 376 USD.

    • @puciohenzap891
      @puciohenzap891 2 years ago +12

      My 6000 c36 32GB kit was €700 in Europe and it doesn't even work at 6000C36. Well played.

    • @MatarM0
      @MatarM0 2 years ago +1

      Got 32GB of Kingston Fury 6000 C40 for $320 ;)
      It's way below the market pricing now

    • @zakelwe
      @zakelwe 2 years ago +3

      Before Alder Lake came out, Crucial had 4800 actually reduced from £160 down to £155 for 2x16GB, with 90 sticks of it in stock. Then it was not reduced, then it was £175. Now it is £250 and not in stock.
      Although not quick, that original price was, in hindsight, a bargain, even though still more expensive than DDR4.

    • @chrisvig123
      @chrisvig123 2 years ago +12

      Complete waste of money at this point…I’m not getting bent over for diminishing returns 😯

    • @LiveType
      @LiveType 2 years ago +3

      Eh, I'd rather get 128GB of DDR4 for about the same price. Sure, it's not "fast" memory (3200 CL16), but you can open up anything you want and keep it running. Especially on Windows, which has bloated up significantly with how aggressively it preloads and caches stuff. Right now I'm at 9.8GB active memory with 17GB utilized. I literally only have a password manager, cloud file sync, and Chrome with ~15 tabs open. How? Like, actually, how? Why does it need that much memory? That should take no more than about 6GB (the 9.8GB is about right), yet Windows managed to fill up 17GB.
      If anything, I'd get 64GB of decent Samsung B-die DDR4 and tune those sticks. That's the best performance you'll get without going overboard. Memory performance is important and makes almost everything your computer does faster. Now add in an Optane cache and you're off to the races.
      I'd give DDR5 another 18 months before it becomes a better buy than DDR4.

  • @striker241186
    @striker241186 2 years ago +19

    Thanks for the tests, Steve. I got a 12700K myself and kept the DDR4 3200 kit from my previous motherboard.

    • @greenumbrellacorp5744
      @greenumbrellacorp5744 2 years ago +1

      3200? OC that to 3600, I'm sure they can go with a little voltage (if they crashed on your old system, I bet that was the old IMC, not the memory itself). I myself had 32GB of DDR4 3200 that refused to run past 3200 at ANY voltage, and it required a ton of IMC overvolting on my 7700K. Now 3600 is easy, and I've pushed it to 3800 and it's working fine (I expected 3600 to be a bit annoying, but nope: 1.4V and done, 3800 at 1.45V). RAM is fine at 1.5V anyway; that's what the heatsink is for (if it's an actual heatsink and not some useless plastic RGB crap over the RAM).

    • @sebastianwallace268
      @sebastianwallace268 2 years ago

      My 3200 CL16 4x8GB Crucial Ballistix kit is running at 3600 MHz with just 1.37 volts vs the stock 1.35. Definitely try a memory overclock. It was as simple as upping the frequency and rebooting the PC to see if it worked. Didn't touch any timings.

    • @greenumbrellacorp5744
      @greenumbrellacorp5744 2 years ago +3

      @@sebastianwallace268 Are you sure you kept the timings? Remember to ALWAYS set them manually when doing that: if they're on auto and outside the XMP profile specs, the board can YOLO it and set CL22 timings or worse.

    • @striker241186
      @striker241186 2 years ago

      @@sebastianwallace268 thanks for the tip. I overclock my CPU and GPU, but never thought of overclocking my RAM

    • @sebastianwallace268
      @sebastianwallace268 2 years ago

      @@greenumbrellacorp5744 the timings are manually set to the XMP values, except for tREFI, which is set to the max of 65535, which is supposed to help. It may even run at a lower voltage, but I know 1.37 is safe and bumped it just to ensure stability.

  • @NewmanOnGaming
    @NewmanOnGaming 2 years ago +46

    CL36 is a rough trade for the speed premium when you can still get solid performance out of CL14 and CL16 DDR4 kits.

    • @Dr.WhetFarts
      @Dr.WhetFarts 2 years ago +8

      Well, DDR5 destroys DDR4 in several of these games, and DDR5 is brand new and far from mature; we will see 8000-10000 speeds in the next few years. DDR4 sucked on release compared to high-end DDR3. DDR5 seems far more promising than DDR4 did.

    • @NewmanOnGaming
      @NewmanOnGaming 2 years ago +1

      @@Dr.WhetFarts These speeds are only as good as the CCX cache of the CPU. For Intel, in certain types of workloads, this would be great. For AMD it may not correlate the same. Overall, in gaming, it's nowhere near as impressive. It's all about what you want to accomplish with said speeds and modules.

    • @Pr3tti
      @Pr3tti 2 years ago +10

      @@NewmanOnGaming "Nowhere near as impressive"? LOL dude, in several games it was over 20% faster. That's what the chart says, and that's now, when DDR5 is immature and DDR4 is highly matured... as time goes on, developers will only add support and optimize for DDR5 more. It's GG DDR4, bruh.

    • @weecious
      @weecious 2 years ago +3

      @@Pr3tti It's funny that DDR5 is up to 20% faster "mostly" in Ubisoft titles... lol.

    • @kowismo
      @kowismo 2 years ago

      @@weecious Didn't know Fortnite was from Ubisoft, but okay.

  • @michaelbaldwin5953
    @michaelbaldwin5953 2 years ago +68

    Like the set, including the overclocked OLED TV with air-conditioning cooling above it 🙂

    • @humer101
      @humer101 2 years ago +2

      He is in the lower level of the house, here we call it the basement, which is nice and quiet for those videos.

  • @vMaxHeadroom
    @vMaxHeadroom 2 years ago +40

    Agreed, I won't be looking to upgrade from my 12700K for at least 4 to 5 years, and by then it will be 15th gen with ultra-cheap DDR5 or even DDR6!

    • @mikerzisu9508
      @mikerzisu9508 2 years ago +1

      Would be nice to have the platform now to upgrade to faster ddr5 down the road versus having to buy a new mobo

    • @cicciopasticcio6939
      @cicciopasticcio6939 2 years ago

      i've still got one 10700k @5.1 and i don't see one reason for upgrade...for almost 3 or 4 years

    • @bobbrock4221
      @bobbrock4221 2 years ago

      I'm waiting for DDR12 myself.

    • @weecious
      @weecious 2 years ago +1

      @@cicciopasticcio6939 You've had 10700K for 3 - 4 years..? Damn... you must be in the future then since 10th gen hasn't even been out for 2 years as of right now.

    • @silentlygaming3396
      @silentlygaming3396 2 years ago

      @@weecious I think they meant they don't see a reason to upgrade for 3 to 4 more years

  • @Tepnox
    @Tepnox 2 years ago +18

    Nice to see Riftbreaker in your test setup. This game has a pretty good implementation of the newest techniques (ray tracing, VRS, FSR) and runs pretty well, despite being demanding on recent hardware!

  • @warlordwossman5722
    @warlordwossman5722 2 years ago +126

    41 games benchmark? That's impressive.
    I was wondering if you could add Escape from Tarkov to the games you benchmark; it seems to be very CPU heavy and is generally a popular game, so it could be interesting data for many people!

    • @userise222
      @userise222 2 years ago +3

      Might be hard to reproduce the same scenarios each time, so it wouldn't be a fair comparison

    • @tyre1337
      @tyre1337 2 years ago +11

      if it's not on steam it doesn't exist

    • @whiteblack6865
      @whiteblack6865 2 years ago +2

      Factorio too! That is the perfect game to benchmark, especially since it's quite memory heavy. I don't get why they didn't include it.

    • @warlordwossman5722
      @warlordwossman5722 2 years ago +9

      @@userise222 using your logic, why would Steve have spent hours in Battlefield 2042 recreating the same situations? In Tarkov you can not only choose maps but even play offline with AI turned on to make very controlled tests lol

    • @Ideost
      @Ideost 2 years ago

      I totally agree. Would be great to see popular games like Tarkov.

  • @1danbon
    @1danbon 2 years ago +12

    Thank you guys so much for including 4K. Even with RTX 30 Series and RDNA 2, it seems 4K is still ignored in most benchmarks. I know it's heavily GPU bound, but it's still nice to see the data.

  • @jshico3417
    @jshico3417 2 years ago +6

    The future-proofing argument is so dumb. You're better off building a PC now with a plan to upgrade to a new PC for the same price in 5 years than building a PC for the price of both and holding onto it for 10 years.

    • @tyre1337
      @tyre1337 2 years ago +2

      my friend wants to buy a 3090 so he could "play games for the next 10 years without worries"

    • @jshico3417
      @jshico3417 2 years ago

      @@tyre1337 the Maxwell Titan is barely alive at 1080p and that's just 7 years old, and that's without talking about mechanical failure; my 970 required a fan swap after 5 years of use and it was a pain in the ass to get the original fans.

  • @globnomulous
    @globnomulous 2 years ago +3

    You saved me $800. I'm returning my Asus Z690 Hero and my G.Skill Z5 C32 6400 and opting for the MSI Gaming Edge with the 32 GB of 3200 C16 DDR4 I already have. Just signed up on Patreon at $10/mo.

  • @sinephase
    @sinephase 2 years ago +20

    I'm curious if there's ever been a time where a new memory type wasn't matured at least somewhat in the server market before being brought to desktop?

  • @ccjh0806
    @ccjh0806 2 years ago +29

    The under recognition of CPU limitation in games is overwhelming LOL. Most of the time my 3080 isn't running above 97% utilisation

    • @vme23
      @vme23 2 years ago +9

      Pair that 3080 with an Alder Lake CPU and you'll see 99% GPU utilization in most games.
      I have an i9 12900k and my 3080Ti is maxed out in 1440p in all games that I play, even in the ones which are CPU limited like BF2042, New World, Anno 1800 etc.

    • @VanillaWahlberg
      @VanillaWahlberg 2 years ago +6

      NVidia has a lot of driver overhead, they did a video on this a few months back I believe.

    • @Vegemeister1
      @Vegemeister1 2 years ago +5

      The only way to get 100% or near-100% GPU utilization is for the CPU to prepare frame N+1 while the GPU is rendering frame N. That adds latency.

    • @TrueThanny
      @TrueThanny 2 years ago +2

      That's full utilization of the GPU.

    • @andersjjensen
      @andersjjensen 2 years ago +6

      97% is to be considered "a strong GPU bottleneck". The PCIe bus has a command latency. The BCLK fluctuates (as it should if you have spread spectrum enabled, and you should), so the response time from the GPU can vary a bit from command batch to command batch, which makes it impossible for the CPU to time everything perfectly. The GPU itself sometimes stalls on memory reads because of the dynamic-refresh nature of all DDR memory types, etc, etc. Throwing "more CPU" at all of the above won't solve a single thing...
      It's basically only realistic to reach 100% GPU utilization with something like FurMark, because it was specifically designed to do so. FurMark causes practically no PCIe traffic once the assets and shader programs are in VRAM, and the code that runs on the GPU causes very little memory I/O but is insanely compute intensive.

  • @zilkeofficial
    @zilkeofficial 2 years ago +1

    0:06 what is this setup, DAMN ! 🔥

  • @GodKitty677
    @GodKitty677 2 years ago +3

    I was reading about this on another website. DDR4-4400 CL16 and DDR5-4800 CL40 have the same performance in games. In AIDA64 memory tests it's 63,283 read for DDR4 and 70,195 for DDR5, yet the FPS is the same. Going from 4800 to 6000 gains two FPS, and to 6400 one FPS.
    Another website states DDR4-3600 16-20-20-20-34 1T is just 0.9% slower than DDR5 36-36-36-76 1T Gear 2. This is at 1080p; the difference drops to 0.4% at 4K.
    With DDR4-4000 CL15 the latency is 44.7 ns, and with DDR5-5600 C36 it is 57 ns. With overclocking, DDR5 can reach 6500 CL28-37-37-26 CR1, for a latency of 47.6 ns and over 100 GB/s of bandwidth in AIDA64. This matters only if you are not GPU limited, which at 4K you will be.
    With DDR5-5200 CL36, many websites show no real performance increase, with DDR4-3600 C14 ahead by 1 FPS. That's in Far Cry 6, shown as massively better on DDR5 in this video.
    I think the whole DDR5 6000 vs DDR4 3600 matchup is not a valid way of doing this test. DDR4 4000 and 4400 should also be there, as should the slower DDR5 speeds. With the right performance kits you can change the conclusion to your liking. A DDR5 6500 CL28 overclock will perform better but will be GPU limited at 4K. Even websites that show DDR5 massively ahead at 1080p show no gain at 4K because you become GPU limited: average FPS stays the same even with DDR5-4800 CL40 vs DDR4-3200 CL22, i.e. stock memory for the Core i9-12900K (Intel's spec is 76.8 GB/s max memory bandwidth).
    What the video shows is that the higher-end DDR5 overclocks can increase performance, but this is also much more expensive RAM than the DDR4 3600 kit, and it ignores everything in between, including stock speeds. Really, the fastest DDR4 4400 kit should be used if money is no object and you are getting DDR5 6000 or higher. Remember stock is DDR5-4800, and DDR5-4800 trades blows with DDR4-4000 quite well in many reviews.
    Why would you get DDR4 3600 RAM if performance is the goal and the money is there for a DDR5 6000 kit? The justification for the DDR4 3600 kit is very weak, to the point where you would seriously question why this comparison would happen. The video's conclusion is that its DDR4 3600 CL14 overclock is not as good as DDR5 6000 CL36 kits; I did not need a video to tell me that. Why not just overclock to DDR5 6500 with a latency of 48 ns and be done with it, if the goal is the conclusion you want? Then compare to DDR4 3600 and state 20%-plus more performance in many more games.
    Here is a different conclusion: "The DDR5 topped the field by a smidge at 1,920 by 1,080 and the Medium setting. Better memory latency still had some impact, however, as evidenced by the standard DDR5-4800 falling slightly behind the DDR4-4000 XMP at other tested resolutions and settings. But it was super-close."
    And another: "In popular titles like Rainbow Six Siege and Star Wars Jedi: Fallen Order, DDR4-3700 memory is up to 7 FPS faster than the newer DDR5-5400 standard. Although the deltas in most cases are within the margin of error, it can't be denied that DDR4 (even overclocked) is a fair bit cheaper than standard DDR5-5400 kits, which makes it hard to recommend the latter. And this is with the higher-end Z690 chipset."
    Why ignore the slower DDR5 speeds and the faster DDR4 speeds? It makes no sense, unless the goal is a very fast DDR5 kit vs an average DDR4 kit, ignoring faster DDR4 kits that shrink the price difference. It looks like cherry-picking to get the conclusion one wants, ignoring "no real performance gain" as a possibility under other configurations. The real conclusion could be that the DDR5 kits' higher bandwidth balances out their slower latency, that latency and bandwidth both matter, and that at some point faster DDR5 kits could pull away from DDR4 in performance. Just a guess.

  • @ashtonw9931
    @ashtonw9931 2 years ago +72

    Now I am curious how much latency affected this. CL14 is really good, but even normal DDR4 isn't much higher (16-20) at reasonable prices.

    • @Krenisphia
      @Krenisphia 2 years ago +5

      I wonder if the Cas Latency on DDR5 will ever reach CL30 or even lower in the future...

    • @pr0xZen
      @pr0xZen 2 years ago +14

      Having been through 3 of these generational transitions before, I'd suggest expecting good out-of-the-box XMP primary timings for DDR5 to be around CL30-ish well before mid-life cycle. Timings usually double vs the former gen's JEDEC spec, but each gen has had its own new little bits, bobs, tricks and means of optimization (including in the fab department). With DDR5 having more space for XMP timings, and the CPU memory controller no longer needing to do power management (the memory manufacturer can in theory even tune that per stick, because power management is on-board the module with DDR5), odds are good that we're going to get some blistering-fast, tight-timing DDR5 in the next 1-3 years. And you likely won't need those golden unicorn CPU bins anymore (or more specifically, memory-controller bins) to actually get good use of that high-speed DDR5. High-end memory VRMs will help in that department though; not tons of amps, of course, so you don't need a lot of them.

    • @andersjjensen
      @andersjjensen 2 years ago +24

      Always remember that latency is counted in ticks. So DDR5 6000CL40 is the same latency as DDR4 3000CL20 (but double the bandwidth).

    • @kyzyl4915
      @kyzyl4915 2 years ago +10

      @@andersjjensen I am glad you clarified this. Lower CAS does not mean the absolute latency is lower automatically :)

    • @bingbing3464
      @bingbing3464 2 years ago +1

      @@Krenisphia Already has. Tinkerers and overclockers are running 6600 C28 daily.
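
The ticks-vs-nanoseconds point in this thread reduces to one line of arithmetic. A minimal sketch (the helper name is mine, not from the video):

```python
def cas_latency_ns(transfer_rate_mts: float, cas_cycles: int) -> float:
    """Absolute CAS latency in nanoseconds.

    DDR memory transfers twice per I/O clock, so the clock in MHz is
    half the transfer rate in MT/s; CL is counted in those clocks.
    """
    clock_mhz = transfer_rate_mts / 2
    return cas_cycles / clock_mhz * 1_000

# DDR5-6000 CL40 and DDR4-3000 CL20 work out to the same absolute latency:
print(round(cas_latency_ns(6000, 40), 2))  # 13.33
print(round(cas_latency_ns(3000, 20), 2))  # 13.33

# The two kits compared in the video:
print(round(cas_latency_ns(3600, 14), 2))  # 7.78 (DDR4-3600 CL14)
print(round(cas_latency_ns(6000, 36), 2))  # 12.0 (DDR5-6000 CL36)
```

So a DDR5 kit needs roughly double the CL of a DDR4 kit at double the transfer rate just to break even on latency, while gaining bandwidth.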

  • @maxwilliam8470
    @maxwilliam8470 2 years ago +20

    Do you want that 2%-10% framerate gain: YES
    Do you wanna pay 2x the price of the previous GEN: NO

    • @jtowensbyiii6018
      @jtowensbyiii6018 2 years ago +3

      He didn't say 2x, he said 4-8 times the cost

  • @diegoberan7883
    @diegoberan7883 2 years ago +13

    I managed to get playable framerates on an RX 580 in Halo Infinite by overclocking the RAM. I'm playing at 720p low on DDR3-2400 and got almost a 50% FPS increase just by taking the RAM from 1600 MHz to 2400 MHz.
    Setup:
    Core i7 4770K, 4.1 GHz OC on all cores
    RX 580 8GB
    4x4GB DDR3-2400 HyperX Predator
    Asus Z97-A motherboard
    240GB SATA SSD

    • @Leo.Kalash
      @Leo.Kalash 2 years ago

      You'll get a massive boost at a reasonable price if you just move to an AM4 mobo and pair that DDR4 with some Ryzen 5.

  • @HuyV
    @HuyV 2 years ago +8

    The problem is that while the frequency is a lot faster, the latency is actually worse.
    That 3600 CL14 has a latency of 7.78 ns while the 6000 CL36 has 12 ns of latency.

    • @johnboy2436
      @johnboy2436 2 years ago

      To make this simple for you... latency doesn't matter except insofar as it converts to FPS. This is what people like you don't understand. If DDR5 offers better FPS, the effective latency is lower. SIMPLE, done and dusted... so why you're converting latency numbers beats me. DDR5 is different, and the latency numbers say nothing right now. Benchmarks DO.

    • @broklond
      @broklond 2 years ago +4

      @@johnboy2436 bruh, he's talking about memory latency, not input latency xD

    • @johnboy2436
      @johnboy2436 2 years ago

      @@broklond Again, my comment stands and I know exactly what he means... I can type the MHz and timings into a calculator and get the same ns he is referring to... AGAIN, FPS is the end result and tells all. There is no need to quote RAM timings; 12 ns RAM on paper could get better FPS and therefore have less latency in-game...
      Not sure if that's confusing you? Higher FPS = less latency... The on-paper memory latency doesn't mean faster RAM or more FPS... So bruh, chill with the not-so-smart comments. People don't sit posting memory timings for no reason. He thinks DDR5 is slower because of memory latency... IT'S NOT. All these benchmarks and comparison videos involve gaming... THAT'S IT. I have fully addressed his misinformation.

    • @n00blamer
      @n00blamer 2 years ago +2

      @@johnboy2436 Depends on the workload. Some games were slower with DDR5, because the code was more sensitive to latency. Generally, higher bandwidth is better because there are latency mitigation techniques in both hardware and software. In hardware we of course have the cache and this can be augmented with cache-friendly data layout, which stems from software author's decisions. At 4 GHz 12 ns is 48 clock cycles, 12900k with 12-wide execution ports can miss up to 576 instructions in the worst case while waiting for memory. Of course the stall is likely to be less than the maximum because we can assume that the read request was issued way before the data dependency. For writes the latency is less relevant as they go through write commit buffer, and the data is live in the cache.
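
The stall arithmetic in the reply above checks out in a few lines (a back-of-envelope sketch using the commenter's figures of 12 ns, 4 GHz, and 12 execution ports; the function name is mine):

```python
def worst_case_missed_slots(latency_ns: float, core_ghz: float, issue_width: int) -> int:
    """Upper bound on instruction-issue slots lost while a load waits
    the full DRAM latency (assumes no overlap and no cache hit)."""
    stall_cycles = latency_ns * core_ghz  # GHz is cycles per nanosecond
    return int(stall_cycles * issue_width)

# 12 ns of CAS latency at 4 GHz on a core with 12 execution ports:
print(worst_case_missed_slots(12, 4.0, 12))  # 576
```

As the commenter notes, real stalls are usually shorter, since out-of-order execution issues the load well before the data dependency.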

  • @ImaMac-PC
    @ImaMac-PC 2 years ago +2

    @14:00 anyone else notice the "death defying" near vertical descent in the upper left hand corner of the G.Skill box? Looks so cool against the tinted backdrop...like he was descending a mountain near sunset...I wonder if he made it down 😄 Steve, your camera is so good, that was a great shot! Love the new studio and different areas set up.

    • @iffy_too4289
      @iffy_too4289 2 years ago +1

      Until I went back a few seconds from 14:00 I was thinking "This guy's a bit mental. WTF is he on about?"
      It's an ant on the box. OP is "a bit imaginative".

  • @francis147741
    @francis147741 2 years ago +7

    Loving the new studio!!!

  • @Yellowswift3
    @Yellowswift3 2 years ago +1

    Love the lighting here, it looks really good. I also noticed Steve giving us the eye when talking about the RAM (Trident Z, I think it was). :)

  • @yedrellow
    @yedrellow 2 years ago +5

    The biggest problem with the future-proofing argument is that within ~18 months people will want to be using 8000 MT/s RAM, which won't run on current motherboards anyway.

    • @chillnspace777
      @chillnspace777 2 years ago +1

      This

    • @chillnspace777
      @chillnspace777 2 years ago

      For me, future proofing is just going mid-range and upgrading every few years. No point in the top end when the mid-range usually matches or beats it in a few years' time.

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 2 years ago

      Why bother with a high RAM frequency if it's nearly impossible to see the FPS difference in games without an FPS counter?

  • @EyesOfByes
    @EyesOfByes 2 years ago +103

    Regarding memory and gaming: according to Mojang's own devs, Minecraft is very memory dependent. Whether they meant latency or bandwidth I'm not sure. So it would be interesting to see some Minecraft benchmarks, perhaps with some of the most popular mods/launchers. Or are there so many variables in testing Minecraft that it would either make no sense, or take a 10-hour video on the topic? 😏🤓 Why Minecraft? Because my nephew's Christmas wish is a gaming PC, duh 😉

    • @machompko9905
      @machompko9905 2 years ago +19

      You are gifting your nephew a gaming PC with DDR5???? Holy shet.

    • @sabnox9869
      @sabnox9869 2 years ago +15

      The reason there's no point in testing Minecraft is that it can be played on a Haswell CPU with DDR3 at max settings with more frames than your monitor can display. If you want to get your nephew a PC, maybe look into an old Dell office PC with a 4th-gen or newer i7 (or Xeon equivalent); then you just need a card at the 1060-1070 performance level and a 550-600W PSU and you're golden.

    • @zeinnassar2770
      @zeinnassar2770 2 years ago +16

      @@sabnox9869 he asked about mods, those things get really demanding

    • @0xCAFEF00D
      @0xCAFEF00D 2 years ago +11

      @@zeinnassar2770 If you start benchmarking with mods you've got a huge number of variables to cover. Every mod will behave differently.
      You could make a benchmark for common libraries but if that doesn't already exist there's no sense in trying.

    • @criscrix3
      @criscrix3 2 years ago +2

      Mojang's statement is probably only about the amount of memory, though it would be interesting to see some benchmarks in heavily modded Minecraft.

  • @nathanddrews
    @nathanddrews 2 years ago +9

    While it's not exactly comparable to standard ECC RAM, the extra on-die ECC bit-flip protection of DDR5 is worth some of the premium IMO. *SOME*

  • @xgunnas32
    @xgunnas32 2 years ago +1

    I believe this is the first extensive tighter-timings DDR4 vs DDR5 test, thank you HUB

  • @HeirofCarthage
    @HeirofCarthage 2 years ago +56

    Wow... why the hard push by Intel and other companies for DDR5 if it is so useless in games? Maybe games will use it one day, but yeah, I see no reason to get DDR5 boards for the foreseeable future. Great video! Thanks.

    • @Hybris51129
      @Hybris51129 2 years ago +10

      It goes to show that memory speed should take a back seat to the sheer amount of memory. Trying to compensate for only having 8 or 16 GB by buying super-fast memory has long been known to be a waste of money, one that, if we are being honest, has been either ignored or even encouraged by the popular tech tubers.

    • @amirpourghoureiyan1637
      @amirpourghoureiyan1637 2 years ago +19

      I imagine we'll see the difference once they get the timings down; latency matters more these days than megatransfers.

    • @4RT1LL3RY
      @4RT1LL3RY 2 years ago +30

      Mostly it's for servers. DDR5 is going to offer higher capacities. This is exactly how it goes at the start of each memory generation: the top end of the last gen beats the brand-new high end until they find the right combination of bandwidth, latency, etc. DDR4 took 2-3 years to start being superior at reasonable prices.

    • @AnticipatedHedgehog
      @AnticipatedHedgehog 2 years ago +1

      Money!

    • @skywalker1991
      @skywalker1991 2 years ago +11

      It's a tactic to make fanboys believe a higher number is better. Intel simply wants you to believe Alder Lake CPUs must be better because they support DDR5. Some dumb fanboys fall for it; some just throw money at the newest technology even if it doesn't offer any benefit.

  • @Dragon211
    @Dragon211 2 years ago +4

    Solid work as usual. I'll wait until the chip shortage is over before I make any major upgrades to my system. My general rule is "if you're getting at least 40 fps, don't upgrade." That flashy new hardware is probably tempting to buy, but you rarely need it.

    • @Izzydawizzy
      @Izzydawizzy 9 months ago

      This is the comment I needed to read😂😂😂

  • @moos5221
    @moos5221 2 years ago +4

    This is a great test; I just wish you had used the widely available CL16 instead of niche CL14 DDR4 RAM.

  • @pyramidschemer4083
    @pyramidschemer4083 2 years ago +2

    You need to get some rest ASAP; your contributions are invaluable.

  • @mikfhan
    @mikfhan 2 years ago +18

    Hmm, yeah, DDR5 prices are ridiculous, so the future-proofing is probably more about having a board that can later use DDR5 once speeds grow further and the price drops to normal. It sounds like latency won't improve much over DDR4. Shame the new boards can't also handle the old generation; you're probably best off waiting for DDR5 price reductions before doing the board upgrade.

    • @Vegemeister1
      @Vegemeister1 2 years ago +1

      Latency hasn't really budged since, like, 2004. Nobody should be holding out for lower-latency DRAM.

    • @mikerzisu9508
      @mikerzisu9508 2 years ago

      @Klip One at this particular time. Give it a year and DDR5 will be smoking DDR4, and then you're stuck with a motherboard that doesn't support it. I'm going DDR5, and I'll upgrade to higher-frequency RAM later and sell what I had, probably for not much less than what I bought it for. Things will improve with BIOS updates as well. The problem is getting it at the moment.

    • @fightnight14
      @fightnight14 2 years ago

      DDR4 isn't going away anytime soon. I still use my DDR3 and 4th-gen Intel without problems, and I still enjoy lots of modern games. When DDR5 speeds go up, current motherboards won't even support those kits, so you'll still end up replacing the board.

  • @miceforever9956
    @miceforever9956 2 years ago +3

    Thank you so much for all your hard work Hardware Unboxed :)
    Love from Denmark

  • @saumyashhah
    @saumyashhah 2 years ago +14

    I'd like you to include budget DDR5 kits, as 6000 is very expensive.

    • @Hardwareunboxed
      @Hardwareunboxed  2 years ago +27

      It's a safe assumption that budget DDR5 will be even worse, but it should scale relative to budget DDR4.

    • @SGIMartin
      @SGIMartin 2 years ago +16

      DDR5 is not going to be a "good" buy for at least a year, as with every new iteration of memory. DDR4 was very expensive and not much faster than DDR3 when it was new.

    • @TommyVercettisGamingNews
      @TommyVercettisGamingNews 2 years ago

      They'll be scalping PC cases next

    • @avatarion
      @avatarion 2 years ago

      Corsair just announced DDR5 6400. I'm hoping that a year from now I will be able to buy 7000 for my Zen 4.

    • @imadecoy.
      @imadecoy. 2 years ago

      I would hope that if you are buying DDR5 you've already committed to not cheaping out. It's like buying the cheapest Porsche to say you have a Porsche. :P

  • @sorincosma5185
    @sorincosma5185 2 years ago +2

    What game is running in the background on the TV? Looks really good.

  • @DigitalMarksman
    @DigitalMarksman 2 years ago +7

    Hi Steve! After 6 months of DDR5 on the market, could you guys please do another short review to see if anything has changed for these games, considering Windows/driver updates, market prices, and newly released kits? I'd also love to see the difference between streaming and non-streaming while playing those games, where the DDR5 vs DDR4 difference might really show. Thanks!

  • @mthorne074
    @mthorne074 2 years ago +20

    Considering the impact RAM speed has on AMD platforms, I'm very interested to see how things perform there in the future

    • @greenumbrellacorp5744
      @greenumbrellacorp5744 2 years ago +3

      RAM LATENCY kills Ryzen... and CL40... we'll see. That said, that can change vastly with 3D cache and the new IMC, so... yeah

    • @popcorny007
      @popcorny007 2 years ago +7

      @@greenumbrellacorp5744 CL40 means nothing if you don't mention the speed.

    • @greenumbrellacorp5744
      @greenumbrellacorp5744 2 years ago +1

      @@popcorny007 Well, obviously I'm talking about the DDR5 modules shown in the video: 4800 CL40

  • @TheIronArmenianakaGIHaigs
    @TheIronArmenianakaGIHaigs 2 years ago

    I wasn't expecting the War Thunder benchmark; I'm grateful you included it. Thank you

  • @Karazure
    @Karazure 2 years ago +13

    Glad to see The Riftbreaker making it into the comparison. That game is fun!

  • @marceldiezasch6192
    @marceldiezasch6192 2 years ago +37

    Someone testing Fortnite, nice to see. How exactly did you test? I'd assume replay mode? Did you test a heavy late-game scenario with lots of players around, or something lighter? I'm asking because I've found that Fortnite results can change heavily between those workloads. Ryzen 5000 was significantly (as in 20+%) faster than the 10900K in light scenarios, but in heavy late-game scenarios the 10900K was within the margin of error.

    • @Hardwareunboxed
      @Hardwareunboxed  2 years ago +25

      It's the replay, and you can see the footage. It's mid-game, but that doesn't really matter too much; as long as both memory types were tested in the same section, scaling will be very similar throughout the game. I'm also very familiar with Fortnite performance ;)

    • @marceldiezasch6192
      @marceldiezasch6192 2 years ago +6

      @@Hardwareunboxed Thanks. I'd love to see the game become part of your standard titles. It's still a very popular game with both a large casual and esports audience, and it's the only UE5 game out right now. Sadly, the constant updates complicate testing, as sometimes they can impact replay performance or flat-out break the replay file.

    • @casperes0912
      @casperes0912 2 years ago +1

      @@marceldiezasch6192 What? Fortnite uses UE5? When did that happen?

    • @Hardwareunboxed
      @Hardwareunboxed  2 years ago +15

      @@marceldiezasch6192 The problem is that the benchmark (replay) is broken with every update, which is why no one bothers testing the game. You have to throw out your data each week.

    • @marceldiezasch6192
      @marceldiezasch6192 2 years ago +1

      @@casperes0912 They made the switch earlier this month with the update to Chapter 3

  • @fasic46
    @fasic46 2 years ago +4

    We're waiting for the AMD APU DDR5 use case

  • @keyserxx
    @keyserxx 2 years ago +2

    Nice roundup, thanks. I've never really been a RAM hype-train guy; I just buy whatever my motherboard manufacturer's QVL tells me to, test it, and forget it. I can't remember what I'm on right now, except that it's the greatest CPU ever made (3600) ;) Love the new digs.

  • @johntotten4872
    @johntotten4872 2 years ago +2

    @Hardware Unboxed Great video, Steve, and spot on, man. Would love to see you test DDR4-4000 CL16 or CL17 memory in the upcoming memory test if you can. Some of us bought some nice memory for our last build and migrated it over to a new Alder Lake build.

  • @garytrawinski1843
    @garytrawinski1843 2 years ago +1

    I liked the surprised look you had in the reflection from the RAM... LOL

    • @brandonmoore7797
      @brandonmoore7797 2 years ago

      Haha, I know. Cracked me up too... ruclips.net/video/omumzW1AtGE/видео.html

  • @johnnywoolen1453
    @johnnywoolen1453 2 years ago +17

    DDR5 does have one advantage: all of the board makers seem to have kneecapped the DDR4 boards when it comes to premium features. For example, there is only one DDR4 motherboard that has a POST code readout, and it isn't available (in the US) either. So basically you don't get that feature unless you use DDR5. You also won't get two PCIe Gen 5 slots unless you move up to a DDR5 board.

    • @moldyshishkabob
      @moldyshishkabob 2 years ago +6

      Interesting to note, though a bit of an artificial advantage.

    • @MaZEEZaM
      @MaZEEZaM 2 years ago +1

      @@moldyshishkabob I bought the Gigabyte Aorus Pro DDR4 version; it has a POST code readout, but it's a very limited board. I wanted the Master, but it's only available in DDR5, which can't be bought in Australia yet.

    • @SnifferSock
      @SnifferSock 2 years ago

      Who is using gen 5 slots already?

    • @johnboy2436
      @johnboy2436 2 years ago +2

      Where do you find this junk info? Even the MSI Edge, a cheap LGA 1700 DDR4 board, has PCIe Gen 5... In fact, most boards I'm looking at have PCIe Gen 5 slots... Mate, get out of here with your rubbish info.

    • @MicaelAzevedo
      @MicaelAzevedo 1 year ago

      @@johnboy2436 not PCIe 5.0 x16

  • @Frzao
    @Frzao 2 years ago +1

    Since everyone has already commented on everything technically related, I'm just gonna say this: Trident Z RGB DDR4 design > Trident Z RGB DDR5 design.

  • @solidreactor
    @solidreactor 2 years ago +11

    [Feedback] When presenting DDR memory speed and CAS latency, it would be nice if you could complement them with the calculated latency in nanoseconds.

    • @JackMott
      @JackMott 2 years ago +2

      That would be good feedback for the RAM industry!

    • @Lead_Foot
      @Lead_Foot 2 years ago +1

      First-word latency is just the best-case scenario and doesn't tell you what latency is in real-world situations. DDR5's independent channels per DIMM and same-bank refresh improve latency in certain situations. Intel's latency checker shows more comprehensive latency results for different workloads.

    • @nathangamble125
      @nathangamble125 2 years ago

      @@Lead_Foot Of course, but it's still useful to know, as it's correlated with the other latency timings. That's why CL is seen as so important in the first place.

    • @solidreactor
      @solidreactor 2 years ago

      @@Lead_Foot Well, do you think comparing CAS (tick latency) is more appropriate than time latency in nanoseconds? Intel's latency checker isn't something you can use to add value to a benchmark like this one; it is itself a benchmark tool and would need its own slides or a fully dedicated benchmark video (which would be interesting in its own right). I'm talking about adding extra info, one more variable, to an existing benchmark like this one: time latency in addition to tick latency (e.g. CAS).

    • @solidreactor
      @solidreactor 2 years ago

      @@jayden974 They are not irrelevant at all if you compare TIME latency in nanoseconds between the types of RAM. You are thinking of comparing "clock tick" latency (e.g. CAS), which is NOT what I'm referring to.

  • @lurick
    @lurick 2 years ago +1

    41 games? That's what, a half hour of work for you Steve?
    I expect at least 100 games, lol
    /s
    Great job as always HWUB!

  • @Constantin314
    @Constantin314 2 years ago +5

    I think AMD was very smart not to release next-gen Zen at this moment, seeing what is happening in the DDR5 market. AMD still has more sales than Intel because of this DDR5 situation

    • @benjaminoechsli1941
      @benjaminoechsli1941 2 years ago +1

      "Intel's got absurd amounts of money. Let's let _them_ be the beta testers for this stuff."

  • @jmashcomptech
    @jmashcomptech 2 years ago +1

    Can we get a review on that massive monitor behind you?

  • @dainluke
    @dainluke 2 years ago +5

    I appreciate your effort, Steve, but I've been saying for a long time that the solution is simple: find two equally priced high-performance 32 GB kits.
    The 6000 MT/s Z5s at $470 MSRP should at the very least have been pitted against the 32 GB 3800 CL14 kit. It just makes the most sense. This is like comparing a 3080 to an RX 6800.
    The 3600 CL14 kit isn't slow by any means, but why not go like for like in terms of price range? I don't know for certain, but the 32 GB 4000 CL14 kit might actually be right at the MSRP of the Z5 6000s.

    • @ThunderingRoar
      @ThunderingRoar 2 years ago

      Wouldn't 4000 CL14 dual-rank be difficult to run in Gear 1 on Z690, since most of the DDR4 boards are entry-level ones?

    • @dainluke
      @dainluke 2 years ago

      @@ThunderingRoar Not necessarily. On the Asus and MSI Z690s, it shouldn't really be an issue at all. A lot of guys got 4000 Gear 1 working. Gigabyte is a bit of an issue.
      Also, I saw your previous iteration. I don't want to sound like a dickhead, but it's Steve from HUB: if he wanted/needed a $500 kit of DR 4000 CL14s, he'd get one for sure.

    • @dainluke
      @dainluke 2 years ago

      @H27W Fair point. I feel like when these comparisons are made, this should be explicitly mentioned if that's the purpose of the video.

  • @js1zzle
    @js1zzle 2 years ago +1

    I really recommend that your benchmarks include Escape from Tarkov; it's pretty popular and can be demanding at times. It's a technical feat

  • @SB-qo2fo
    @SB-qo2fo 2 years ago +4

    I believe that activating SAM/Resizable BAR would reduce the difference between DDR4 and DDR5 in the games that showed the highest differences, like AC Valhalla. Can you confirm that, Steve?

    • @MrMeanh
      @MrMeanh 2 years ago +3

      It would actually be very interesting to see if this is the case.

  • @TheModeRed
    @TheModeRed 2 years ago

    That PC setup in your background is super terrific! Have you done a tour?

  • @asldfkhjaslk
    @asldfkhjaslk 2 years ago +5

    Therapist: Standing Steve doesn't exist, he can't hurt you.
    Standing Steve:

  • @tuhaggis
    @tuhaggis 2 years ago +2

    Thank you for benching SC2. I know it's not popular, but it's still my main game, and these videos are really helpful in making upgrade decisions.

    • @Battal_Gazi
      @Battal_Gazi 2 years ago

      Where's SC2? I didn't see it.

    • @Worgen4ik
      @Worgen4ik 2 years ago +2

      @@Battal_Gazi StarCraft 2 @ 7:16

  • @bibeltours
    @bibeltours 2 years ago +3

    So, you tested 41 games. That's impressive! But if you have so much time, why not leave one or two games out and do one or two PRODUCTIVITY BENCHMARKS instead?
    I know your audience is most likely to be gamers, but I'm pretty sure many of them do things besides gaming, like video or image editing / rendering / streaming / compiling / etc.

    • @TrueThanny
      @TrueThanny 2 years ago +1

      He already did that in the day-one review, as he stated at the beginning of the video. This video is explicitly about getting more data on DDR5 in games.

    • @swft_
      @swft_ 2 years ago +1

      DDR5 smokes DDR4 in productivity workloads due to bandwidth, capacity, and on-die ECC. In games, especially the FPS type, DDR4 benefits from having lower latency; however, in other CPU-heavy game types DDR5 handles them better or equal.

  • @SPACER-97
    @SPACER-97 2 years ago +1

    The B-roll at 11:39 has me dying lol

  • @Ryan-ml7ws
    @Ryan-ml7ws 2 years ago +6

    It would be cool to see how heavily tuned DDR4 compares to DDR5. 4133 15-15-15-35 dual-rank with tuned secondary and tertiary timings is possible on the 12900K.

    • @dainluke
      @dainluke 2 years ago

      A friend of mine and I got lucky with MSRP Z5 at a local retailer here. My system was very borked, so I haven't had a chance to properly use or test it, but he used the latest beta BIOS to hit 1.25 V 6200 36-36-36 with relative ease.
      This makes me think 7000 is very doable with the Samsung ICs at 1.5 V. I only own a Strix-F, so I'll have to see where it caps out at 6400, but I'm hoping for something with relatively tight timings. By that point DDR5 does significantly outpace even very good DDR4.
      My previous setup was a 5.1/4.8 10850K with 4000 15-15-15-30 DR and super-tight subtimings. I'll probably not notice much difference in Vanguard, but I'm expecting it to make more of a difference in games that had obvious CPU/memory limitations, like Far Cry 6 (stutter for days on Comet Lake).

  • @gregnezz
    @gregnezz 1 year ago

    Thank you very much for this and all the tech vids. This is my go-to place. Love from Ireland!

  • @mehrdadgholami5738
    @mehrdadgholami5738 1 year ago +3

    Did anybody notice the spider on the G.Skill box at 13:56 - 14:08? 😂😂😂

    • @heikg
      @heikg 1 year ago +1

      Nice eye

    • @porojandaniel5738
      @porojandaniel5738 11 months ago

      Would have been nice to see a huntsman spider or a black funnel-web one :))

  • @WXSTANG
    @WXSTANG 2 years ago +1

    Watching this stuff makes me happy I didn't go gangbusters on the motherboard for 3rd-gen Ryzen.

  • @monikaw1179
    @monikaw1179 2 years ago +3

    I'm on Ryzen so it doesn't matter to me, but the good thing with Intel using DDR5 first is that by the time AM5 launches the DDR5 market should hopefully have taken off in a bigger way (faster kits, better supply, slightly lower prices).

  • @proesterchen
    @proesterchen 2 years ago +1

    Gaming benchmarks on the quickest gaming platform? A nice surprise! 👍

  • @derptyderp5287
    @derptyderp5287 2 years ago +5

    I wonder if you can get over 60 fps in Star Citizen's Orison area with DDR5; it's definitely a game that accesses a lot of system memory.

    • @alexandroutsos5990
      @alexandroutsos5990 2 years ago +1

      If you don't mind me asking, what type of Star Citizen player are you? I ask because I still love the concept and scope of the game. However, with the size of the team and the "update map" always significantly delayed, I wonder if restarting development in Unreal 5 would be smarter at this point. Five years delayed and 20% of the content in a functional alpha phase.
      I guess my question is: at what point will you give up on it as a pipe dream? Or will you never give up on it?
      Three years of waiting for them to have a persistent universe, and it's still buggy as fuck.

    • @derptyderp5287
      @derptyderp5287 2 years ago

      @@alexandroutsos5990 I tried the free-fly earlier this year, then bought the base package after that.
      I'm sceptical, to say the least, but it's fun and I'd like it to get somewhere.
      Considering how long it took them to get to this point, I doubt starting over would be a good call, especially since they've grabbed a load of the Crytek team to work on the engine.

  • @Superiorer
    @Superiorer 2 years ago +2

    Is it me, or are you guys on a massive quality-content streak?

    • @Hardwareunboxed
      @Hardwareunboxed  2 years ago +7

      Yeah, about 5 years now :D Haha, thanks mate.

  • @vibonacci
    @vibonacci 2 years ago +8

    You should run the AIDA64 memory bandwidth test. You'll instantly contact a scalper for DDR5.

    • @phoenixzappa7366
      @phoenixzappa7366 2 years ago +4

      DDR6 will be faster 😁

    • @frieza1016
      @frieza1016 2 years ago +3

      Which is pretty much irrelevant for gaming...

    • @scaryonline
      @scaryonline 2 years ago +6

      Imagine buying DDR5 just for benchmarks and not for real-life performance

    • @ThunderingRoar
      @ThunderingRoar 2 years ago +7

      You should run the AIDA64 memory latency test. You'll instantly forget about that overpriced DDR5 kit.

    • @vibonacci
      @vibonacci 2 years ago +1

      @@ThunderingRoar Was waiting for this all-too-obvious rebuttal 😛

  • @Lazarosaliths
    @Lazarosaliths 2 years ago

    Wow, the new setup is CRAZY!!!
    Please make a video of the set so we can put it as a background on our PC monitors

  • @emilioestevezz
    @emilioestevezz 2 years ago +10

    In the context of a $2000 build, paying an extra $200 for an x% performance improvement sounds much more reasonable than saying one individual component is 2x the price of another. You should compare the relative costs of the entire systems used to generate the results, as the DDR5 system does not cost 2x the price of the DDR4 one.

    • @humanbeing9079
      @humanbeing9079 2 years ago +5

      12:30 Watch the video and stop misconstruing the argument because you have a hard-on for paying hundreds of dollars for bottom-of-the-barrel DDR5.
      I doubt anyone would advocate for "future-proofing" with DDR4-2133 in hindsight.

    • @adi6293
      @adi6293 2 years ago +4

      He kind of does, with the whole bundle price. Either way, DDR5 is pointless at the moment.

    • @Hardwareunboxed
      @Hardwareunboxed  2 years ago +5

      Emilio, along with everyone who upvoted his comment, clearly didn't watch the entire video :D

    • @jerrynovotnik
      @jerrynovotnik 2 years ago +4

      ??? WTF... it's surprising how stupid people can be... amazing. You know what? Buy a house together with the PC. Then the RAM price is even more negligible for an x% increase.

  • @eitansharabi9488
    @eitansharabi9488 2 years ago

    I was looking around for a clear comparison, and goddamn did this hit the spot compared to other sources I've seen

  • @ssj3mohan
    @ssj3mohan 2 years ago +3

    DDR5 = Garbage.

  • @iseeu-fp9po
    @iseeu-fp9po 2 years ago

    Excellent as usual, Steve. You look happy in your new place.

  • @3nimac
    @3nimac 2 years ago +3

    If we're doing tests for science, I'd love to see them compared at roughly equal latencies in nanoseconds, because presumably DDR5 will get better in the future, but at that point we won't care to bench DDR4 anymore.

    • @nathangamble125
      @nathangamble125 2 years ago

      But at what frequency? Presumably you'd need the DDR5 running at a much higher frequency, or you'd need very high-end DDR4 with extremely loose timings to run both at the same frequency. I guess maybe both at 4800 could work?

  • @SaveTheHero210
    @SaveTheHero210 2 years ago

    I can’t remember the last time I saw Steve standing up. Great video.

  • @Muldeeer
    @Muldeeer 2 years ago +5

    It would be good to test this with Escape from Tarkov.

    • @GTRWelsh
      @GTRWelsh 2 years ago

      A lot of the CPU load comes from scavs, and it's not easy to replicate sessions exactly like for like, but it's such a CPU-demanding game, man. I totally agree it would be great if it were doable.

    • @devilboner
      @devilboner 2 years ago

      Is Tarkov particularly memory-sensitive compared to other games?

  • @halfwayvinny9786
    @halfwayvinny9786 2 years ago

    Thank you for summing up the results at the beginning. Got the info I wanted, dropped a like. Great video.

  • @Wongseifu548
    @Wongseifu548 2 years ago +8

    I feel sorry for those who paid the early-adopter tax for DDR5.

    • @isakh8565
      @isakh8565 2 years ago +2

      Don't; they can sell their kit for $2000 on eBay 😂

    • @noodlefoo
      @noodlefoo 2 years ago +3

      Don't be; they obviously have more money than sense.

    • @heickelrrx
      @heickelrrx 2 years ago +1

      Probably they use it for work, not gaming.
      If they only use it for gaming... Hnggggggg

    • @promc2890
      @promc2890 2 years ago +2

      @@heickelrrx Even for work you need capacity more than speed, so they would just buy a 64-128 GB DDR4 kit for the price of 16 GB of DDR5.

  • @shiishani3302
    @shiishani3302 2 years ago +2

    As always, excellent work. Thank you for keeping us informed and entertained.

  • @sixhunt
    @sixhunt 2 years ago +3

    Cool, I need a 12900K with DDR5 and a 3090 to give me the edge in Quake Live.

    • @promc2890
      @promc2890 2 years ago

      That's got to be the dumbest purchase anyone can make for a gaming setup, when a 12700K + 3080 Ti + DDR4 kit would cost half as much and perform only 2-5% worse.

  • @finnba_h
    @finnba_h 2 years ago +1

    What camera and lens are used for the B-roll? It looks outstanding!

  • @kjdp8519
    @kjdp8519 1 year ago +1

    This video is almost a year old, and I still come back to reassure myself about whether DDR5 is worth it. The only difference from then to now: the DDR5 + mobo premium was about 200 USD and has gone down to 140 USD today. That's still a lot of money for a 3% gain. Good thing I have this part 11:41 to look forward to every time I rewatch this.

  • @hhectorlector
    @hhectorlector 2 years ago

    Some great B-roll footage here! Also: kick-ass gaming room.

  • @theigpugamer
    @theigpugamer 2 years ago +2

    This new set is beautiful, Steve.

  • @sorinpopa1442
    @sorinpopa1442 2 years ago +1

    Nice that you added StarCraft 2 to this bench now!

  • @Bang4BuckPCGamer
    @Bang4BuckPCGamer 2 years ago

    Must have taken a while to do this; thank you for the data.

  • @davidbetancourt4028
    @davidbetancourt4028 2 years ago

    I would not want to have been in the room when Steve found out that he couldn't recover his Halo saves. Many an F-bomb was dropped, I'm sure.
    So painful. My sympathies. Great review, thanks.

  • @raptor6600gt
    @raptor6600gt 2 years ago

    Cheers for adding The Riftbreaker to your benchmark suite. Please include it if you do a CPU benchmark comparison in the future.

  • @crf637
    @crf637 2 years ago

    Thanks! Perfect! Exactly what I'd been looking for these last few weeks.

  • @selectthedead
    @selectthedead 2 years ago

    Steve, that's a nice air-cooling setup you have for your TV there! XD

  • @captante9889
    @captante9889 2 years ago

    Wow... nice image quality in this video!
    And I'm currently running OC'd DDR4-3000 at 3333/1667 CL16 on a Ryzen 5800 and am not bothering to upgrade it anytime soon.

  • @boyanmilev653
    @boyanmilev653 2 years ago

    I watched some of your videos before buying components for the new build, which helped tremendously with the choices, thank you very much! In the EU, I managed to buy 64 GB of DDR4-3600 memory for less than the cost of 32 GB of DDR5-6000, and that's not even accounting for the extra cost of the motherboard. Even though I had no limitations on the build, it just made no sense to go DDR5 right now. Glad to see the more extensive testing only proves this further.

  • @vh9network
    @vh9network 2 years ago +2

    Why did you go with the Radeon RX 6900 XT and not the RTX 3080 Ti or RTX 3090?

    • @mza2001
      @mza2001 2 years ago

      EXACTLY. Dude wasted his time.

  • @youngporky5788
    @youngporky5788 1 year ago

    I can see a lot of effort went into bringing us this data. Much appreciated!!

  • @ottomixa
    @ottomixa 2 years ago +1

    Did you use XMP on the DDR5, or did you let it run at the default 4800? I didn't see that mentioned in the video.

  • @rwk1013
    @rwk1013 2 years ago

    Wait, Steve has legs? I didn't know that. I'm digging your big-screen TV cooler on the wall. Awesome video. Glad I went with DDR4 on my 12900K and the same Gigabyte motherboard.