Ryzen 7 7800X3D Upgrade! The Fastest Gaming CPU vs

  • Published: 6 Oct 2024
  • ► Watch the FULL video here: • DF Direct Weekly #137:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry YouTube: / digitalfoundry
    ► Digital Foundry at Eurogamer: eurogamer.net/...
    ► Follow on Twitter: / digitalfoundry

Comments • 658

  • @MarikHavair
    @MarikHavair 10 месяцев назад +73

    Yep, got a 1700 on an X370 back in 2017, running a 5800X3D off of that now, what an upgrade.

    • @timothyandrewnielsen
      @timothyandrewnielsen 10 месяцев назад +15

      Best upgrade ever, the 5800X3D

    • @MarikHavair
      @MarikHavair 10 месяцев назад +13

      @@timothyandrewnielsen It certainly is the best in-socket upgrade I've seen personally.
      AM4 is probably going to be a bit of a legend in future generations. We'll all be like 'back in my day'.

    • @ekhyddan
      @ekhyddan 10 месяцев назад +6

      @@MarikHavair I hope that AM5 will have the same upgradability.

    • @dutchdykefinger
      @dutchdykefinger 10 месяцев назад +2

      i started on an r5-1600 on a b350 in may 2017 when it was pretty fresh
      i slotted a 3700x into the b350 last year or something :D
      not quite as big a jump as you got, but 28% better single threading still ain't nothing to sneeze at, and doubling up that l3 cache plus the separate uncore makes it far less memory clock sensitive too, so i'm riding this AM4 bitch until the wheels fall off

    • @timothyandrewnielsen
      @timothyandrewnielsen 10 месяцев назад +2

      @@dutchdykefinger started on x370 with 1800x. Same. 5800X3D until I see CPUs come out with comparable power efficiency and double the speed.

  • @SogenOkami
    @SogenOkami 10 месяцев назад +53

    Been using this CPU for the past month on my flight sims and good lord it destroys everything I throw at it!

    • @Teramos
      @Teramos 10 месяцев назад +4

      so does my 14700k@350watt :D damn, the performance per watt is through the roof on the 7800x3d

    • @KiSs0fd3aTh
      @KiSs0fd3aTh 10 месяцев назад +1

      @@Teramos When your 14700k pulls 350 watts it's literally twice as fast as the 7800X3D. In fact even at 200w it's twice as fast as the 7800X3D.

    • @coffee7180
      @coffee7180 6 месяцев назад +4

      @@KiSs0fd3aTh wtf are you talking about, i don't get it.

    • @H786...
      @H786... 5 месяцев назад +3

      @@Teramos that is CAP, the 7800x3d is easy to run on an air cooler lol.

    • @a-qy4cq
      @a-qy4cq 5 месяцев назад +1

      @@Teramos Your brain performance is defo not through the roof

  • @45eno
    @45eno 7 месяцев назад +5

    I went from a 5900x to a 5800X3D to a 5800x to a 7600x to a 7800X3D in the last 10 weeks. The vanilla 5000 series definitely didn’t let my 6900xt breathe as well compared to both the 3D chips and the 7600x.
    Snagged smoking prices with very, very little out of pocket. The 7800X3D was a local buy but was sealed brand new for $340. Sold my 5800X3D for $280.

  • @i11usiveman97
    @i11usiveman97 10 месяцев назад +160

    I have the 7800x3d with a 7900xtx and Lords of the Fallen still stutters when moving between areas even though it's had many patches to date. There's nothing a CPU can do about a poorly optimised game engine.

    • @solo6965
      @solo6965 10 месяцев назад +4

      Nothing is perfect

    • @Denton1944
      @Denton1944 10 месяцев назад +31

      Yeah this game is just poorly optimized, it has nothing to do with CPU

    • @DeepStone-6
      @DeepStone-6 10 месяцев назад +13

      At that point blame the devs, that's pretty much an apex PC configuration.

    • @S4_QUATTROPOWER
      @S4_QUATTROPOWER 10 месяцев назад +1

      Getting the 7950X3d instead

    • @100500daniel
      @100500daniel 10 месяцев назад +14

      Dev response: Obviously it's because you have a plebian 7900 XTX instead of an RTX 4090. Quit being Cheap 🫰

  • @dereksimmons1075
    @dereksimmons1075 Месяц назад +3

    I just upgraded to the 7800x3d from the i7 8700. Wow, what a difference, now I just need to upgrade my 3080, but I think I will wait till the next batch of AMD and NVIDIA cards drop before I upgrade.

    • @Racer-X-090
      @Racer-X-090 21 день назад

      I'm doing the exact same thing. Glad to know there is a big difference for my money!

  • @roki977
    @roki977 10 месяцев назад +139

    what that cpu does with like 50w is the most impressive thing for me.. 5800x3d/7800xt here, great for my 240hz screen..

    • @abdirahmanmusa2747
      @abdirahmanmusa2747 10 месяцев назад +1

      What are your frames looking like?

    • @Vinterloft
      @Vinterloft 10 месяцев назад +6

      I waited and waited with a 3570K for the first Ryzen mATX X-chipset motherboard. That's the one thing I will never compromise on. It took until X570 to arrive (thanks ASRock!) and by that time I already could go for 5800X3D as my first upgrade since Ivy Bridge. 11 years, how's that for an update cycle!?

    • @roki977
      @roki977 10 месяцев назад

      @@abdirahmanmusa2747 i can get it over 200 in warzone, close to 300 in the Finals beta, i am waiting for that one to come out... nothing fancy, just cheap 3600 cl16 sticks and a b550 mb...

    • @roki977
      @roki977 10 месяцев назад +1

      @@Vinterloft i am not far from that, a 3770k and gtx 580 Lightning Xtreme was my last serious rig until i bought am4 about a year and a half ago, but i had a 5600 and 5700x in there before the x3d

    • @mangatom192
      @mangatom192 10 месяцев назад

      @@Vinterloft I had an i3 2120 before I upgraded to an AM5 platform (7600). I can DEFINITELY feel the difference.

  • @mattx4253
    @mattx4253 10 месяцев назад +24

    i got a 7800x3d and 4090 and it's solid as a rock so far. Crazy cool as well, whereas my old intel and 3080 was a room heater....

    • @KiSs0fd3aTh
      @KiSs0fd3aTh 10 месяцев назад +13

      That's the kind of nonsense that is spread around. Your 3080 pulls around 320w, probably three to four times more power than your Intel did (assuming you played 1440p or 4k), your current build probably pulls even MORE power than your old one, but somehow your old one was a space heater. Right, sure buddy. What fucking ever

    • @mattx4253
      @mattx4253 10 месяцев назад +1

      @@KiSs0fd3aTh I can’t explain it other than the 10-core intel always needed max power. The 3080 Aorus Extreme was factory OC'd to the max and I’m now using 2 DIMMs of ram not 4, but my room is cooler. The system doesn’t even try hard in games most of the time. It's just more efficient at hitting 144hz, so your thinking is flawed as you are only concerned with max TDP

    • @KiSs0fd3aTh
      @KiSs0fd3aTh 10 месяцев назад

      @@mattx4253 I have a 4090 myself dude, both my 12900k and my 14900k are below 80w (in fact the 12900k hovers around 50-60) playing at 4k. Does your 3D pull 20w or what?

    • @mattx4253
      @mattx4253 10 месяцев назад

      @@KiSs0fd3aTh I had a 10850k clocked at 5ghz all-core. Pulled mega wattage. My 3080 was the extreme model. I'm also not at 4k, I'm at 1600p widescreen 38". This 7800X3D and 4090 Trio X combined just max out my 144hz in the sweet spot of the power curve, which is efficient. My 10850k and 3080 were always maxed out and couldn't break 100fps in most games even turned down.

    • @KiSs0fd3aTh
      @KiSs0fd3aTh 10 месяцев назад

      @@mattx4253 So you are comparing a stock CPU with an overclocked one, right. Are you autistic by any chance? And of course your old build could break 100 fps, you had a 3080, lol.

  • @larrytaylor2692
    @larrytaylor2692 10 месяцев назад +19

    Doesn’t matter how good your cpu is when these devs refuse to optimize

    • @stealthhunter6998
      @stealthhunter6998 10 месяцев назад +7

      Or in the case of Unreal Engine, the engine itself being the problem and devs choosing it.

    • @Shieftain
      @Shieftain 10 месяцев назад +4

      @@stealthhunter6998 Allegedly UE5.4 will bring some pretty big multi-threading optimizations. But given how game development has become this generation, we probably won't see these benefits until 2025 sadly.

    • @portman8909
      @portman8909 10 месяцев назад

      @@Shieftain Unreal Engine 5 games have all had stupid system requirements.

    • @Shieftain
      @Shieftain 10 месяцев назад

      @@portman8909 Indeed, even when software Lumen is in use it's pretty steep. Disabling it boosts FPS a decent amount, but still the requirements are high. IMO games that have a large amount of static lighting are better off developed in UE4 as the older engine is in a good place now (mostly). Take Ghostrunner 2 and Lies of P for example. Both look visually solid and run extremely well.

    • @divertiti
      @divertiti 5 месяцев назад

      Nothing to do with optimization, modern games with modern visuals need more than a 6 yr old potato to run on

  • @stanisawkowalski7440
    @stanisawkowalski7440 10 месяцев назад +21

    Guys, you don't know how much I miss your CPU reviews! Direct comparisons with framerate and frametime graphs show much more than the typical AVG+LOWs graphs we see everywhere.

  • @ralphmiranda2077
    @ralphmiranda2077 10 месяцев назад +3

    I've been rocking this chip since earlier this year… I'm lovin' it!

    • @ConnorH2111
      @ConnorH2111 10 месяцев назад

      I just bought this cpu for my first pc build, can't wait!

  • @PaulRoneClarke
    @PaulRoneClarke 7 дней назад

    What I like about the X3D is its massive impact on games with many AI objects like Dwarf Fortress, or RTS games with programmable unit movement/types/instructions - games where the AI has to apply rules iteratively across perhaps thousands of entities/units per game cycle.
    That extra cache helps this sort of game immensely. So it's not just for 3D action games. The improvement is seen across just about all types of games.

  • @mrjiggs8
    @mrjiggs8 10 месяцев назад +18

    I went with the 7800x3d a while ago.... what's impressed me most.... the reduction in my electric bill.... amazing the performance per watt with it and my 7900xtx.

    • @TheJonazas
      @TheJonazas 10 месяцев назад +3

      If you wanted the ultimate reduction in power while maintaining high fps, then the 7800x3d with a 4090 would be much better than your 7900xtx. The 4090 is so efficient when undervolted.

    • @Madhawk1995
      @Madhawk1995 10 месяцев назад +3

      @@TheJonazas As is typical of amdTards, they complain about intel cpu power draw while their rdna3 gpu pulls that much in difference simply idling. 7800x3d + 4090 is the most power efficient setup.

    • @Jasontvnd9
      @Jasontvnd9 10 месяцев назад +13

      @@TheJonazas Yeah takes a while to make up for the extra $1000 spent on the card in the first place though doesn't it.

    • @BlacKi-nd4uy
      @BlacKi-nd4uy 10 месяцев назад +1

      i changed from an undervolted 13600k with a 4800/3000mhz all-core setting and 3600 cl14 ram to a 7800x3d CO -30 5800 cl28 setup. around +50% performance at -50% power consumption. the power consumption was the biggest win for me. to be fair, under an fps lock and gpu limit both have enough power and the power consumption is only a little lower on the 7800x3d. only in cpu-limited scenarios is the power consumption difference big.

    • @pakjai5532
      @pakjai5532 10 месяцев назад +2

      ​@Madhawk1995, going the 4090 route is like paying for your electricity bills in advance.😂

  • @dotxyn
    @dotxyn 10 месяцев назад +72

    Enabling EXPO at 6400 may have halved the unified memory controller clock, reducing performance. Check ZenTimings and make sure the UCLK matches the MCLK. If not, you need to set UCLK=MCLK (not /2). Hopefully your X3D will be able to run 6400 at UCLK 1:1, otherwise you may need to drop down to 6200 or 6000 (rough ratio arithmetic sketched at the end of this thread).

    • @the.wanginator
      @the.wanginator 10 месяцев назад +18

      You beat me to it. Even AMD states that 6000 CL30 is the sweet spot. Also, make sure the chipset driver is installed.

    • @xouri8009
      @xouri8009 10 месяцев назад

      Isn’t he using DDR4? Or did I misunderstand?

    • @erickelly4107
      @erickelly4107 10 месяцев назад +2

      @@the.wanginator
      This doesn’t mean that 6200+ MT/s is somehow “worse”; higher will be better as long as you can run it.

    • @mugabugaYT
      @mugabugaYT 10 месяцев назад +7

      @@xouri8009 7800X3D only supports DDR5

    • @arc00ta
      @arc00ta 10 месяцев назад

      Also, you want to make sure your board didn't ramp up VDDP with the profile. Asus and MSI are pretty bad with this; that's the DDR5 I/O voltage, and anything over 1.000V (the default AMD setting is 800mV) can make it super unstable without outright crashing.
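
As a rough illustration of the UCLK/MCLK point in this thread (a minimal sketch; the kits and ratios below are assumptions for illustration, not measurements): DDR5's memory clock is half its transfer rate, and on Zen 4 the memory controller clock either matches it (1:1) or runs at half of it (1:2), which is the silent fallback EXPO can trigger at 6400.

```python
# Illustrative arithmetic only -- the kits and ratios below are assumptions, not measurements.
def uclk_mhz(transfer_rate_mt: int, uclk_to_mclk_ratio: float) -> float:
    """Memory controller clock (UCLK) for a DDR5 kit.

    MCLK is half the transfer rate (DDR = two transfers per clock);
    UCLK = MCLK * ratio, where the ratio is 1.0 for 1:1 or 0.5 for 1:2.
    """
    mclk = transfer_rate_mt / 2
    return mclk * uclk_to_mclk_ratio

configs = {
    "DDR5-6000, UCLK=MCLK (1:1)":   uclk_mhz(6000, 1.0),  # 3000 MHz
    "DDR5-6400, UCLK=MCLK (1:1)":   uclk_mhz(6400, 1.0),  # 3200 MHz, if the memory controller holds it
    "DDR5-6400, UCLK=MCLK/2 (1:2)": uclk_mhz(6400, 0.5),  # 1600 MHz -- the quiet EXPO fallback
}

for name, clock in configs.items():
    print(f"{name}: {clock:.0f} MHz")
```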

  • @gibbles3216
    @gibbles3216 10 месяцев назад +26

    The parallelization is the hard part. You almost have to make a script that will look for CPUs and create profiles based on the core and thread count. This will take time. This is where AI would be very helpful: it could sit in the background optimizing threads for 4, 6, 8, 12, and 16 core CPUs. It would run a check to see the core count and execute the script to start the threading in the Windows scheduler or a pre-baked in-engine scheduler. This way tasks can be offloaded to various threads based on their criticality. For instance, a thread may just be used to compile shaders in the background constantly. Future engines will need to focus on multi-core threading in earnest. You can have the fastest IPC on the planet; it will mean nothing if the engine cannot properly use most, if not all, of the cores and threads at its disposal. I have a 5950x. In newer games it is being utilized at 7-20% based on what is enabled. I know it is not the fastest CPU on the market in terms of IPC, but if utilized properly it is still very fast. TL;DR: the CPUs are not really the problem unless they are budget-level CPUs. The utilization of them is, and will be, till engines are built around getting the most out of them.
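
A minimal sketch of the scheme described in the comment above, under my own assumptions (the profile splits, task names, and worker counts are hypothetical, not taken from any real engine): detect the core count at startup, pick a worker profile, and keep a couple of low-priority workers chewing through shader compilation in the background while the rest handle per-frame work.

```python
# Hypothetical sketch of core-count-aware task scheduling; not taken from any real engine.
import os
from concurrent.futures import ThreadPoolExecutor

def build_profile(core_count: int) -> dict:
    """Pick a worker split based on how many cores are available."""
    if core_count <= 4:
        return {"critical": max(core_count - 1, 1), "background": 1}
    if core_count <= 8:
        return {"critical": core_count - 2, "background": 2}
    # 12- and 16-core parts: keep a few workers free for background jobs.
    return {"critical": core_count - 4, "background": 4}

def compile_shader(name: str) -> str:
    return f"compiled {name}"          # stand-in for real shader compilation

def simulate_frame(index: int) -> str:
    return f"simulated frame {index}"  # stand-in for per-frame game work

cores = os.cpu_count() or 4
profile = build_profile(cores)

with ThreadPoolExecutor(max_workers=profile["critical"]) as critical, \
     ThreadPoolExecutor(max_workers=profile["background"]) as background:
    # Background workers drain the shader backlog while critical workers run frames.
    shader_jobs = [background.submit(compile_shader, f"shader_{i}") for i in range(8)]
    frame_jobs = [critical.submit(simulate_frame, i) for i in range(4)]
    for job in frame_jobs + shader_jobs:
        print(job.result())
```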

  • @icky_thump
    @icky_thump 10 месяцев назад +42

    Just upgraded to this when MicroCenter put it on an early Black Friday sale for only $299! It quickly went back up to $369 like a day later 😅. So far so good. My games' 1% lows seem a lot stronger, the GPU stays near 99-100% more often, and strangely it runs cooler than my 7700X.

    • @phantom498
      @phantom498 10 месяцев назад

      What’s your gpu?

    • @DarthSabbath11
      @DarthSabbath11 10 месяцев назад

      Just bought mine for 398, ffs 😔

    • @ionnicoara6657
      @ionnicoara6657 10 месяцев назад

      You can ask for a price match. I asked Newegg. They didn't give $299 but went to $349.

    • @andrewmorris3479
      @andrewmorris3479 10 месяцев назад +7

      Firstly you scored on that ridiculous price! Secondly, the 7800X3D pulls about 30-40% less wattage in gaming and almost 50% less in multi core workloads.

    • @BiscuitMan2005
      @BiscuitMan2005 10 месяцев назад +1

      Wow and I thought 369 was a deal from them when I got it last week there. I have the worst luck

  • @AshtonCoolman
    @AshtonCoolman 10 месяцев назад +10

    I have a 7800X3D, RTX 4090, and CL30-36-36-76 DDR5 6000 all on an ASUS X670E Crosshair Hero and my system is flying.

    • @baghlani92
      @baghlani92 10 месяцев назад

      I got CL30-38-38-96, is that a good one?

    • @Strengthoftenmen
      @Strengthoftenmen 10 месяцев назад

      I have the exact same setup: 7800x3D, 4090, x670e hero. Buuut initially I was running Corsair Vengeance 5600 and the system kept crashing. Just purchased some G.Skill 6000 CL30 hoping it will improve things.

    • @AshtonCoolman
      @AshtonCoolman 10 месяцев назад

      @@Strengthoftenmen Have you updated your BIOS to the latest version? It's a necessity on these boards. The BIOS this board shipped with has unstable DDR5 support and also will kill your 7800X3D due to too much SoC voltage.

    • @Strengthoftenmen
      @Strengthoftenmen 10 месяцев назад

      @@AshtonCoolman thanks for the reply. Yes I have updated the bios. I think the mb is just very sensitive to timings??

    • @FearlessP4P1
      @FearlessP4P1 10 месяцев назад

      @Strengthoftenmen how has the new ram been working? I’ve heard of a few issues with Corsair and another ram brand, but I’ve yet to hear any issues with G.Skill

  • @SvDKILLSWITCH
    @SvDKILLSWITCH 10 месяцев назад +3

    Definitely agree with Richard on the 5800X3D. I think it's worth keeping in mind that AMD tried several times to break forwards compatibility on AM4, but the whole platform is better off for their eventual concession and extension of AGESA support to 300-series boards. There are other limitations to consider when using such an old board (limited PCIe speeds and expansion being the main one), but I was able to drop a 5800X3D into the same board I purchased back in March 2017 with a Ryzen 7 1700 - nothing short of a gigantic increase in CPU and especially gaming performance. Newer CPUs and BIOS updates also dramatically increased RAM compatibility to the point where I've gone from having to downclock a two-stick RAM setup from 3200 to 2666 to get stability all the way to four sticks at a lazy 3600 CL16 with headroom for lower timings.

    • @PeterPauls
      @PeterPauls 10 месяцев назад +1

      I did a similar upgrade. I have an MSI B450 Tomahawk Max motherboard with great VRMs; it had a Ryzen 7 2700x in it and I upgraded to a Ryzen 7 5800X3D, which was a huge performance leap. My only problem is that there is only a single M.2 PCIe x4 slot, so I use a 2TB SSD in it. I put an RTX 4080 in the CPU-connected slot, which is only PCIe 3.0 x16, but so far there's no PCIe bandwidth bottleneck and the CPU doesn't bottleneck the GPU either.

    • @Solrac-Siul
      @Solrac-Siul 10 месяцев назад +2

      it is a mixed bag in all honesty. While we did get - because i kind of did the same - a sparky new cpu, we miss out on other amenities that a 2017 board like mine lacks compared with a 2020 or 2021 board, which means we actually do not take full advantage of the cpu... we get close, maybe 90 to 95% of the real potential. Eventually, for professional reasons, in early 2022 (February) I actually needed more M.2 x4 slots, and because of that, in that specific timeframe, I had to go the intel way since the Z690 boards were out and had four M.2 x4 slots - and AMD's upcoming platform was by then still multiple months away. So sometimes we value cpu socket upgradeability too much: while it is nice to have an upgrade some 4 years down the road, in 4 years multiple things have moved on and we end up lagging in multiple aspects, and that, depending on each one's personal needs, can vary from a minor inconvenience to actual time loss, which is bad when time=money.

  • @t3chn0m0
    @t3chn0m0 10 месяцев назад +11

    If you run the memory at 6400 MT/s, please make sure that it's running with the FCLK/memory divider at 1:1, otherwise it's 1:2 and you lose quite a bit of performance.

    • @JFinns
      @JFinns 9 месяцев назад +3

      Great tip, this is why even AMD recommends 6000 MT/s CL30 ram for these CPUs.

  • @The_Man_In_Red
    @The_Man_In_Red 10 месяцев назад +11

    Would be really great to hear you guys talk about cache-coherent interconnects like CXL providing high-bandwidth PCIe at extremely low latency, and if/when we might see this tech migrate from servers to mainstream and the effect it could have on gaming and computing.
    A few years ago we heard all about the PS5's I/O accelerator and how Oodle Kraken & texture decompression were a game-changer, but we've yet to see anything like it on PC (Windows' crappy implementation of DirectStorage does not count, in my opinion).
    What are your thoughts on this?

  • @clutch7366
    @clutch7366 10 месяцев назад +1

    Disabling smart access memory helps sometimes, like it did for me in Warhammer Darktide.

  • @that_colin_guy
    @that_colin_guy 10 месяцев назад +3

    3:15
    Displayport Superiority 🤘🏻

  • @12me91
    @12me91 10 месяцев назад +17

    While I would have liked to have seen the 5800x3d used in its day, I'm loving the fact that you'll finally be using a 3D chip

    • @selohcin
      @selohcin 10 месяцев назад +1

      Took the McDonalds commercials a little too seriously, did ya? 😂

    • @SpecialEllio
      @SpecialEllio 10 месяцев назад +1

      @@selohcin loving something isn't a commercial exclusive and he didn't even say the mcdonalds quote

    • @puregarbage2329
      @puregarbage2329 10 месяцев назад +4

      Hardware Unboxed used it in almost all of their benchmarks when it was still relatively new. It’s conclusively better than the 10900k, and pretty close to the 12900k, sometimes slightly better. The 7800x3d is easily the best gaming cpu when you consider the price.

  • @SanderSander88
    @SanderSander88 10 месяцев назад +1

    Just bought my pc upgrade, it's coming today, got it for 350€, so that's a good deal for where I live. So excited to get building.

  • @prototype8137
    @prototype8137 10 месяцев назад +3

    Jeez. The star citizen footage is like night and day. 😅

  • @elbowsout6301
    @elbowsout6301 10 месяцев назад +1

    I just built a new rig 6 weeks ago with a 7800X3D and an RX 6950 XT and I couldn't be happier. It's been a few years since I have built a PC from scratch and the build process was seamless. Go treat yourself to some 6000 MT/s RAM.

    • @godblessusa1111
      @godblessusa1111 9 месяцев назад

      weak card for this processor...need min rtx 4080

  • @Kapono5150
    @Kapono5150 10 месяцев назад +2

    Just picked up the 7800X3D. Shout out to Intel's “Core Truths”

  • @puregarbage2329
    @puregarbage2329 10 месяцев назад +5

    Amd is the 🐐 for cpus, I don’t care what anyone says. 5800x3d going strong almost 2 years later. And only at 100watts 👀

    • @Hi-levels
      @Hi-levels 10 месяцев назад

      Am4 is going strong as well.

    • @dvxAznxvb
      @dvxAznxvb 10 месяцев назад

      Yeah 3d cache is barely bottlenecked at this point in time

  • @elvertmack5039
    @elvertmack5039 10 месяцев назад +3

    I went AMD...but only halfway. Got a 12900k combo deal with cpu, motherboard, and ddr5 6000 ram for like 400 bucks for the whole combo and paired it with a 7900xtx, and I tell you... this combo is butter smooth. I also have a 13700k/4080 and I must say that the 7900xtx system feels smoother than my 4080 system. Both are on a 34 inch Alienware oled monitor at 1440p and getting beautiful results.

    • @VYM
      @VYM 10 месяцев назад

      Worst combo ever. An extremely power-hungry CPU and an equally hungry GPU together is a big fail. You should've bought a 7800x3d and an RTX 4070 with a 550 watt PSU.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 9 месяцев назад

      @@VYM Most people don't care about that. I'd personally rather have AMD CPUs in my system with a 4090 so I don't risk saturating my 1000w PSU, but if I did go Intel, I wouldn't care that much about the extra 100w. Most people just wanna run games and get good performance with manageable temps. Despite my 5900X only drawing a max of 145w under full load, it's ridiculously hot anyway

    • @VYM
      @VYM 9 месяцев назад

      @@mttrashcan-bg1ro I cut down my 14700k's turbo boost power from 253/135 to 110/110 and the all-core boost ratios from 55/56 to 42. Lost 14% performance, but temperatures now don't go over 65-70 degrees, with Cinebench dropping from 30500 to 26000.

  • @razzorj9899
    @razzorj9899 10 месяцев назад +3

    I just upgraded from a ryzen 5600 to the 7800x3d and wow, what an upgrade - more than doubled my fps in warzone 2 and mw3, and I play at 3440x1440, so yes, the cpu matters at higher resolutions as well

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 9 месяцев назад

      CPU matters at 4k especially with a 4090, my 4090 is pretty much never at 99% usage on any game at 4k Ultra, and RT makes matters worse because a 5900X just can't do it. But seeing a 7800X3D and 14900K on benchmarks, they only do like 30% better which isn't worth spending another $2000AUD just yet.

    • @xbox360J
      @xbox360J 8 месяцев назад

      Same here !!! That cpu upgrade has been awesome along with my 3080.

  • @RickGrime80
    @RickGrime80 20 часов назад

    The fact that this cpu only draws 38-60 watts while gaming is just amazing

  • @bmw523igr
    @bmw523igr 10 месяцев назад +3

    Great timing. Yesterday I changed my platform from an AM4 5600 to an AM5 7800X3D with a 7800XT. The setup is not complete (missing one more M.2 drive), but I can't wait to use it.

    • @ZackSNetwork
      @ZackSNetwork 10 месяцев назад

      You are GPU bound, but nice CPU nonetheless.

  • @Cernunn0s90
    @Cernunn0s90 10 месяцев назад +11

    I can't stand stutters, and recent years with shader compilation stuttering in so many games have honestly been driving me nuts. Something has to be done.

    • @stealthhunter6998
      @stealthhunter6998 10 месяцев назад +8

      It’s not just shader stutter, it’s traversal stutters which are even worse. Straight up stutters all the time, not just one time. Like the Dead Space remake. 100% the worst part of PC gaming rn.

    • @liftedcj7on44s
      @liftedcj7on44s 10 месяцев назад +1

      It has made PC gaming no fun for me, the way these developers are coding these games is just a stutterfest everywhere.

    • @michaelzomsuv3631
      @michaelzomsuv3631 10 месяцев назад +2

      @@stealthhunter6998 The original Dead Space can still be played. I played it recently and it was the smoothest experience, a constant 240fps without a single frame drop on a 240hz monitor. Way more fun than the shitty lagfests that 'modern' games and remakes are.

    • @stealthhunter6998
      @stealthhunter6998 10 месяцев назад +1

      @@michaelzomsuv3631 Ik… the quality of older games is just the best. I was playing Halo CE in OG graphics again and was amazed by the fact they had RTX-like effects in the glass. You can see Chief crystal clear in the reflection of the floor in the control room. Sure it’s dated, but for back then that’s a huge achievement for a real-time lighting effect. Just so much care and effort put into them compared to today’s stuff.

  • @rob4222
    @rob4222 10 месяцев назад +1

    I want more CPU benchmarks. CPUs do matter too

  • @spiritualantiseptic
    @spiritualantiseptic 10 месяцев назад +43

    I jumped from R5 7600 to 7800X3D solely because DF made me conscious of stutters.

    • @BigGreyDonut
      @BigGreyDonut 10 месяцев назад +5

      I made that exact same jump because I had an “It’s time” moment once it went on sale again

    • @spiritualantiseptic
      @spiritualantiseptic 10 месяцев назад +10

      @@BigGreyDonut To be fair though, the performance jump isn't actually worth the price because the R5 7600 is great value, but man, DF is helping hardware producers 😂

    • @BigGreyDonut
      @BigGreyDonut 10 месяцев назад

      @@spiritualantiseptic hey now…don’t actually use reason🥲

    • @DeepStone-6
      @DeepStone-6 10 месяцев назад +10

      I got the 5800X3D because it ran Star Citizen the best out of everything at the time, but then I realized every other game I play ran much smoother too, so now if it doesn't have 3D at the end I don't want it.

    • @nightspiderx1916
      @nightspiderx1916 10 месяцев назад

      @@DeepStone-6 I had the 5800X3D and also the 7800X3D, but i might stay with Zen 5 without V-Cache because the Zen architecture likes tuned DRAM.
      In Star Citizen a tuned Zen 4 is almost as fast as a Zen 4 with V-Cache, because some games want the +600 MHz and for some games the additional 64MB of L3 is not enough.
      There are testers out there on YouTube who compared the latest intel and amd CPUs with tuned DRAM, and with tuned DRAM Intel is often ahead, like in Star Citizen.
      So i'm thinking about buying the "vanilla" Zen 5 and tuning it as well as i can with DRAM 6400 and beyond (subtimings).
      ruclips.net/video/HzPrGv22R7g/видео.html

  • @FooMaster15
    @FooMaster15 10 месяцев назад +2

    Some problems you just can't brute force with sheer CPU power... I hope UE5 gets more CPU optimized soon.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 9 месяцев назад

      UE5 doesn't seem to have CPU problems as much, it scales pretty well with core/thread count and speed/IPC, the UE5 games are just really GPU heavy. A lot of these newer games with CPU problems such as Jedi Survivor are still using UE4, the ones on UE5 that are CPU dependent probably just want faster chips

  • @badwolf2185
    @badwolf2185 Месяц назад

    Hopefully the new XeSS that intel is working on allows faster shader compiling.

  • @f-35x-ii
    @f-35x-ii 10 месяцев назад +6

    I'm still going strong with my 5800x3d, and i even undervolted to -30 per core and wow, temps are good and performance is still top notch! I'm not gonna upgrade the cpu for maybe 8 years, and will get whatever is top price to performance then. (Edit: from all-core to per-core -30 undervolt)

    • @devilmikey00
      @devilmikey00 10 месяцев назад +1

      AMD CPUs are great in that undervolting usually increases performance. I had a 5700x for a bit, and using PBO and an undervolt got me 250mhz more boost while running cooler. I know the X3D parts can't raise their boost, but I bet it's staying at its max boost longer and more consistently now that you've done that.

    • @f-35x-ii
      @f-35x-ii 10 месяцев назад

      @@devilmikey00 correct, mine stays at max boost all the time, it doesn't even go down at all while gaming. It was locked in earlier BIOS versions, but with the newest BIOS it is easily changed!

    • @teddyholiday8038
      @teddyholiday8038 10 месяцев назад +2

      I'm running all-core -30 on my 5800x3D too
      It’s a beast CPU but I don’t think it’s gonna last me 8 years. There are plenty of recent games just hammering CPUs
      I’m skipping Zen 4 but I’ll consider whatever the Zen 5 equivalent is (9800X3D?)

    • @exoticspeedefy7916
      @exoticspeedefy7916 10 месяцев назад +2

      Per core is better, otherwise you could actually lose performance if you have too much of a negative offset

    • @f-35x-ii
      @f-35x-ii 10 месяцев назад +1

      @@exoticspeedefy7916 oh sorry , that is what I meant, per core, just double checked my BIOS

  • @jadedrivers6794
    @jadedrivers6794 10 месяцев назад +2

    Being able to run a 5800X3D on a motherboard from 2017 is amazing! Getting a 35% increase in fps going from an R5 3600 to a 5800X3D at 1440p (source: HUB) is mind-boggling

    • @markotto1767
      @markotto1767 10 месяцев назад

      With the same gpu? What gpu do you have?

    • @jadedrivers6794
      @jadedrivers6794 10 месяцев назад +1

      @@markotto1767 I have a 7900 XT. HUB tested with a 6950 XT. There is around a 15% performance gap between the two cards but the results should be very similar.

  • @MrErball
    @MrErball 10 месяцев назад +6

    Every now and then I'm reminded how much these guys are out of the loop when it comes to hardware. They're the definitive tech-tubers when it comes to graphics technologies and gaming, but the hardware driving it is as much black-magic to them as it is to most enthusiasts.

  • @kybercrow
    @kybercrow 8 месяцев назад

    Yeaaaaah, the point they made in the first 5 minutes was very apt. People are very quick to blame others' hardware, and for some reason hardly ever want to attribute major issues to game optimization. Yes, a better CPU will give better performance, as this video shows, but people are very hyperbolic on this point; the big issues are likely to be the game.
    I have the 7700X. Avatar runs like butter on ultra settings with the occasional hitch here & there, but I get frequent stutter in CP 2077 after patch 2.1.

  • @TheRealDlo
    @TheRealDlo 10 месяцев назад +5

    Gonna get a lot of people really in their feelings because of brand loyalty 😆
    Listen, OVERALL, MOST people will be happy with a 13600k-14900k, or a 7700x-7800x3d.
    Get rid of all this "best 1%" BS. MOST people will gain NOTHING from the extra 2-5 fps
    I owned a 13600k. It was GREAT!
    I now use a 7800x3d because I wanted to build AM5 & see how it does.
    Guess what, I prefer the 7800x3d even after weeks and weeks of testing.
    I think building TODAY, for an average consumer, AM5 makes more sense. IMO
    For PROS? #1 they don't care & have others build for them. And if money & power isn't an issue, specifically for esports, the 14900k is prob the "best"
    But anyone reading this really shouldn't care
    Do what you want, spend your money how you want.
    Just don't be a closed-minded elitist prick. MANY people DO care about budget.
    I digress ✌️

  • @awdrifter3394
    @awdrifter3394 10 месяцев назад +3

    It's kind of disappointing that AMD didn't push the 7800x3d to 10 or 12 cores.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 9 месяцев назад

      Since they've managed to get 16-core CCXs on the newest (I think) Threadripper CPUs, I'd imagine the 8950X will be just a single CCX, but who knows. Either they ditch the dual-CCX layout, or they increase the core count. If we get another 2x8-core CPU I think people are gonna be pretty mad. I've got a 5900X and I'm hoping to upgrade to an 8950X or 15900k, and the choice to go Intel will be super easy if AMD decides to not do anything besides increase IPC and clock speed. They've not increased cores on the consumer Ryzen CPUs since Ryzen 3000 back in 2019; back then 16 cores was a lot, now it's not really a lot, it's just a good amount.

    • @everope
      @everope 9 месяцев назад

      8000 series is gonna be laptop and 8000G APUs... So next gen will likely be called 9950x

    • @awdrifter3394
      @awdrifter3394 9 месяцев назад

      @@mttrashcan-bg1ro They could've done dual CCX with 6 cores each.

    • @danijelb.3384
      @danijelb.3384 8 месяцев назад

      Why would they? They want us to upgrade to a 12 core x3d chip. They’re going to milk this, understandably

  • @Train115
    @Train115 10 месяцев назад

    I've been using one for a few months now, it's strong. :)

  • @mypeeps1965
    @mypeeps1965 9 месяцев назад +4

    I have both intel and amd high end gaming rigs side by side and i'm instinctively drawn to my intel rig. I have upgraded every year since the core duo and athlon days, but since ryzen's inception til now every amd system has had gremlins/phantom issues that i have never been able to solve. Note: i owned a brick and mortar pc shop for 30 years and have no horse in the cpu and gpu race. I believe all the issues with ryzen are due to memory and/or the memory controller, but i have no proof, just decades of experience. I think the memory controller overheats and gets buggy, but that's just my opinion. As far as gpus go i prefer Nvidia overall, but that mostly depends on who's manufacturing the gpu and the class of card and resale value. Take care.

  • @benknapp3787
    @benknapp3787 8 месяцев назад +1

    I have this CPU coming later this week along with a 7900XTX for my very first build. Should be fun!

    • @keynoteFlyer
      @keynoteFlyer 3 месяца назад +1

      I’ve just ordered the same combo, how’s it been?

    • @benknapp3787
      @benknapp3787 3 месяца назад

      @@keynoteFlyer It's been absolutely amazing. I run a 65" LG C2 and two 24" Dell side monitors. I run every game I play at 4K Max settings and it crushes them.

    • @keynoteFlyer
      @keynoteFlyer 3 месяца назад +1

      @@benknapp3787 thanks for the quick reply Ben. That sounds amazing. My setup is mainly for flight sim and a few Rockstar games. I’m looking forward to this massive upgrade from my old i7 with 6600XT. 😀👍🏻

    • @benknapp3787
      @benknapp3787 3 месяца назад

      ​@keynoteFlyer You won't be disappointed. What version of the 7900XTX did you order? I'm running the Merc 310. It hits about 450w with the power limit raised but it's worth the extra performance in demanding games.

    • @keynoteFlyer
      @keynoteFlyer 3 месяца назад +1

      @@benknapp3787 I’ve ordered the ASRock 7900 XTX Taichi edition. Both of my previous cards have been Sapphire Nitro + versions but I can’t find any of those in the 7900XTX. The Taichi sounds like a decent equivalent. What Power Supply did you go for? The AMD website recommended 1000W but the builder wanted to give me 850W (based on the lower power draw of the 7800X3D I think)

  • @nemz7505
    @nemz7505 9 месяцев назад

    The 3DNow patch for the original Freespace was pretty impressive to be fair ;)

  • @dante19890
    @dante19890 10 месяцев назад

    1:45. John's delayed laugh cracked me up XD

  • @kapono2916
    @kapono2916 10 месяцев назад +1

    Digital Foundry switches from Intel to AMD, wow that’s BIG news

  • @J_Brian
    @J_Brian 9 месяцев назад

    What helped me is upgrading my RGBs to OLEDs

  • @fracturedlife1393
    @fracturedlife1393 10 месяцев назад

    ACC massively prefers the 3D V-Cache too. Many games have a crazy uplift, and almost as many others would prefer the non-X3D.

  • @Moshenokoji
    @Moshenokoji 10 месяцев назад

    The 7800X3D's power efficiency is just ridiculous. Less than 60 watts for the whole package while playing games. One of the biggest drawbacks is the lower clock; a 7700X will win in certain tasks, like shader compilation.

  • @vulcan4d
    @vulcan4d 10 месяцев назад +3

    I hope you guys do memory tuning and compare. It is amazing how close the 7700x is to the 7800x3d when you have tight memory timings. The secret sauce is that the 7700x has better latency up to 16mb of cache; above that the 7800x3d wins, but with tight memory timings it isn't that much of a difference any more.

    • @JsGarage
      @JsGarage 10 месяцев назад +2

      But what’s the point? Pay a little more, get an overall faster CPU for gaming, and you don’t have to mess with memory tuning or risk corruption when you don’t get your timings just right. I guess if you love tweaking, get a 7700X and feel special that you got there differently. For most people it’s a waste of time for what you’d save tweaking memory on a 7700X instead of going the 3D route.

  • @ymi_yugy3133
    @ymi_yugy3133 10 месяцев назад

    It kind of depends on what's causing the stutter, but it's not impossible that even a slightly lower-class CPU has massive stutters where a faster one doesn't.
    Imagine something like memory management. As you play you keep accumulating garbage that needs to be freed. If the CPU is idle, it runs the garbage collection; if it isn't, it delays the task.
    A fast CPU might have enough capacity to run the garbage collection before having to render the next frame, but a slower one delays it until it goes over some memory limit and then suddenly has a much more expensive garbage collection pass. This is of course a contrived example and no sane programmer would write it like this. But when you have many, many tasks, weird things happen.
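
A toy model of that scenario (every number below is invented for illustration, and the thresholds are assumptions, not profiled from any game): when the per-frame work leaves no idle time, cleanup keeps getting deferred until it eventually lands on a single frame as one large spike.

```python
# Toy model of deferred cleanup causing a frame-time spike; every number is made up.
FRAME_BUDGET_MS = 8.0   # time available per frame before it becomes a visible hitch
CLEANUP_COST_MS = 3.0   # cost of cleaning one frame's worth of garbage
BACKLOG_LIMIT = 6       # frames of garbage tolerated before cleanup is forced

def simulate(label: str, frame_work_ms: list[float]) -> None:
    backlog = 0
    print(label)
    for i, work in enumerate(frame_work_ms):
        idle = FRAME_BUDGET_MS - work
        if idle >= CLEANUP_COST_MS and backlog:
            backlog -= 1                        # enough idle time: clean up incrementally
            work += CLEANUP_COST_MS
        elif backlog >= BACKLOG_LIMIT:
            work += backlog * CLEANUP_COST_MS   # forced, expensive collection -> spike
            backlog = 0
        backlog += 1
        print(f"  frame {i:02d}: {work:5.1f} ms")

simulate("fast CPU (frames have slack):", [4.0] * 12)     # stays roughly flat
simulate("slower CPU (no slack per frame):", [7.5] * 12)  # periodic ~26 ms spike
```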

  • @georgeindestructible
    @georgeindestructible 10 месяцев назад

    11:57 - 12:57 yet that's literally a CPU killer right there, my 5800x was struggling so hard there.

  • @CruitNOR
    @CruitNOR 10 месяцев назад

    8:26 Mate the difference is literally night and day!

  • @Nintenboy01
    @Nintenboy01 10 месяцев назад +2

    Based on leaks/rumors Zen 5 may not be a massive leap in terms of IPC - Zen 2 to 3 was apparently a bigger jump. If you're already on Zen 4 it seems more prudent to wait for Zen 6 or even 7 or see how Intel's 15th or 16th Gen do

    • @_N4VE_
      @_N4VE_ 10 месяцев назад

      I have a 7900xtx and have been debating upgrading to the 7800x3d from my 5800x3d. After many videos it seems it’ll get me roughly a 15-20fps gain in general, and some games don’t show much improvement at all. After pricing everything I need, I’ll be spending $725.00. Is that even worth it? Or should I just stick with my 5800x3d until next gen?

    • @Nintenboy01
      @Nintenboy01 10 месяцев назад +1

      @@_N4VE_ probably best to skip at least 1 Gen. Maybe wait for the 8800X3D

    • @_N4VE_
      @_N4VE_ 10 месяцев назад +1

      @@Nintenboy01 Thanks, makes sense. I'd rather spend that kind of money and get a good jump in performance rather than a minor one. Guess I’ll just have to live with it for now, the 5800x3d is still a great chip and is doing well with the 7900xtx.

    • @Nintenboy01
      @Nintenboy01 10 месяцев назад +1

      @@_N4VE_ yeah, in many games it's also almost as good or even slightly better than the 12900K

    • @_N4VE_
      @_N4VE_ 10 месяцев назад +1

      @@Nintenboy01 glad i looked into it more and got second opinions. I almost impulse-bought an upgrade that I don’t really need lol. The extra frames would be nice but i don’t see it justifying dropping $700+ on it

  • @jergernice1
    @jergernice1 10 месяцев назад +2

    just built my 7700x, seems to work great. have no idea what scheduling he's talking about adjusting?

    • @gellideggti
      @gellideggti 10 месяцев назад

      It will not be a problem for the chip you have, mate, it's more for the 16-core dual-CCD parts, one CCD with 3D cache and one without. They use the Microsoft game stuff to sort out which half of the chip to use: the 3D half or the standard half.

  • @guntherultraboltnovacrunch5248
    @guntherultraboltnovacrunch5248 9 месяцев назад +2

    As a guy who lives in the Arizona desert, I'm just happy with how it sips power and runs cool.

    • @M41NFR4M3Gaming
      @M41NFR4M3Gaming 8 месяцев назад

      I live in Arizona too. I’ve had both. My 14900k runs cooler than the 7800x3d. Returned the 7800.

  • @badwolf2185
    @badwolf2185 Месяц назад

    Star Citizen would have had a 50% difference if both runs hadn't been day vs night.

  • @Typhon888
    @Typhon888 10 месяцев назад +5

    Every real hardware enthusiast already knows X3D is just hype. It’s good in 10 games with a $2,500 GPU… then look at the multithreaded performance, which is worse than my 11900k from 2019.

    • @mitchmurray2260
      @mitchmurray2260 9 месяцев назад +1

      exactly... it's strictly a 1080p chip to run up the fps

    • @evilleader1991
      @evilleader1991 9 месяцев назад

      @@mitchmurray2260 lol what?? X3D really improves the 1% and 0.1% lows lmao

  • @Dex4Sure
    @Dex4Sure 9 месяцев назад

    6:08 I also use this scene to test CPU single-thread performance in games. The Crysis engine is just very single-thread heavy, and that scene always maxes out a single thread and creates a huge bottleneck for the GPU.

  • @yousuff1
    @yousuff1 10 месяцев назад +1

    With curve optimizer you can make the 5800X3D/7800X3D incredibly efficient.
    -30 all cores for me, rarely breaks 50c in games.

    • @exoticspeedefy7916
      @exoticspeedefy7916 10 месяцев назад +1

      Better to test per core. You can lose performance if you have too much negative offset

    • @yousuff1
      @yousuff1 10 месяцев назад

      no loss in performance or stability, i ran the necessary tests for longer than suggested. The 5800x3d has been binned quite well, loads of people manage -30 all core. @@exoticspeedefy7916

  • @Battler624
    @Battler624 10 месяцев назад +2

    I seriously doubt what a lot of comments online say.
    A lot of people don't see stutter.

  • @asadasada-e6s
    @asadasada-e6s 10 месяцев назад

    It is really a generational difference and you've shown that.

  • @FNXDCT
    @FNXDCT 3 месяца назад

    i am very surprised. I thought my 5800x3d was stuttering with shaders, but i notice even the latest cpus do! :o
    The games should find ways to preload the shaders beforehand.
    The Last of Us Part 1 preloaded shaders, and i can tell you that with a 5800x3d and 4060ti i've had no stutters at all, it was buttery smooth.
    Unreal Engine 5 games are the worst for stutters right now, they stutter very hard on Nvidia cards, especially when you start the game and during the next 30 minutes of gameplay

  • @mattio79
    @mattio79 10 месяцев назад +1

    For the 7800X3D, the optimal RAM configuration is DDR5 6000 CL30. MANY review outlets have confirmed that it's the sweet spot for the Zen 4 parts, regardless of the current or previous AGESA updates. Of course, if there are kits with more bandwidth and the same timings as the 6000 CL30 kit, those would supersede.

    • @tbunating1451
      @tbunating1451 10 месяцев назад +2

      That’s what I have paired with my 7800x3D, no problems in the few months I’ve had it

  • @supraken123
    @supraken123 10 месяцев назад

    Go on lads.

  • @tecnocracia09
    @tecnocracia09 10 месяцев назад +1

    Looks like this guy does not follow Hardware Unboxed. For the 7800x3d the optimal memory config is 6000 MT/s CL30, more than that is a waste

  • @Texshy
    @Texshy 10 месяцев назад

    One game that has MASSIVE benefits from the X3D cpu's is Squad.

  • @AdadG
    @AdadG 10 месяцев назад

    "[...] but it's not like, it's not a generation little difference" Alex said. But... what generation are we talking about, from the 10th to the 11th or from the 13th to the 14th generation of Intel? If so, there are like 5 generations of those together.

  • @JayzBeerz
    @JayzBeerz 10 месяцев назад +5

    And next year we'll have the 8800X3D

    • @Aggrofool
      @Aggrofool 10 месяцев назад +1

      Next year is non-X3D Zen 5. X3D will be 2025.

    • @JayzBeerz
      @JayzBeerz 10 месяцев назад

      @@Aggrofool What?

    • @Aggrofool
      @Aggrofool 10 месяцев назад

      @@JayzBeerz 8800X3D will likely be 2025. The CPU without 3D cache will be introduced first in 2H 2024, with the X3D version many months later.

  • @JoePolaris
    @JoePolaris 10 месяцев назад

    Have you guys looked at ram tuning, matching the Infinity Fabric speed?

  • @roymarr6181
    @roymarr6181 9 месяцев назад

    should be comparing the Intel i9 13900k, seeing as it came out close to the same time as the Amd Ryzen 7 7800x3d, and i bet you the test would be closer. and lastly, to benchmark cpus you don't bench at 4k, you bench at a lower res like 1080p etc

  • @marcotomiri3440
    @marcotomiri3440 10 месяцев назад +1

    i need the 7950X3D for Cities: Skylines 2, the 7800 has too few cores for that game in the late game, with more than 500k people

  • @candidosilva7755
    @candidosilva7755 10 месяцев назад +2

    Next try Chernobylite. Just for fun.

  • @freezxng_moon
    @freezxng_moon 10 месяцев назад +2

    I am upgrading to a 7800x3d from an 8700k.. I only have a 4070 ti and play at 1440p, and the CPU wasn't a problem until recently, when it started bottlenecking in some games like cp2077. I had to actually turn OFF dlss to get more fps lmao

  • @nickvirgili2969
    @nickvirgili2969 9 месяцев назад

    It's awesome, i just gotta upgrade, kinda, from my 6950xt to 7th gen, though i'm hoping we get some love from the company cause it's still a beast at 1440 and 4k so..... Wish nvidia weren't greedy m fers.

  • @VYM
    @VYM 10 месяцев назад

    Yeah, yeah. The keyword is gaming CPU. In every multicore benchmark its performance is 50% of a 14700k's. Not even mentioning stuttering in games. Two really positive points - very low power consumption and the 5nm process node.
    My 14700k is a freaking nightmare, running at 100+ degree temperatures without undervolting and a turbo boost power limit.

  • @ceuser3555
    @ceuser3555 10 месяцев назад +1

    Should have compared it to the 14900k

  • @skywalker1991
    @skywalker1991 10 месяцев назад +1

    Funny thing is the other dudes listening to the guy who tried an Amd cpu look clueless, it's like their faces are asking 'what is AMD???' Lol, it almost seems like they wanna ask him: is Amd found in caves? Is it safe to touch? Will it kill us? What is this thing, AMD? lol

  • @murdergorilla4087
    @murdergorilla4087 8 месяцев назад

    I just got a 7800x3d replacing a 7600x and it did nothing at 1440p

  • @miikasuominen3845
    @miikasuominen3845 10 месяцев назад

    I'll most probably get one now, from the BF sales... Most probably with an ASRock X670E Steel Legend and Corsair 6000/CL30 RGB memory.

  • @chrisbullock6477
    @chrisbullock6477 10 месяцев назад

    Would like to hear an update on this, Alex. How are things going?

  • @frostkaizen1985
    @frostkaizen1985 10 месяцев назад +2

    stutters are also due to poor game optimization...

  • @DhunterPR
    @DhunterPR 6 месяцев назад

    well, the 7800X3D runs better with RAM at 6000 (AMD EXPO) CL30 than at 6400 CL32, but yeah, some games have more stutter than others and that depends on the developers of the game, not always on hardware issues

  • @Lionheart1188
    @Lionheart1188 10 месяцев назад

    Nice, just bought myself one from the Black Friday sales. Already noticed the min & max fps increase.

  • @danijelb.3384
    @danijelb.3384 8 месяцев назад

    How would the 7800x3d fix stutters when it has consistently the worst 1% lows of all the current high-end CPUs? The CPU is amazing value for gaming and the energy consumption is magical, but the only thing that will fix stutter issues is further optimization.

  • @nickvirgili2969
    @nickvirgili2969 9 месяцев назад

    Yes, DisplayPort is better that way for some reason? I haven't fully mapped it yet.

  • @NunYaBeezWax80
    @NunYaBeezWax80 10 месяцев назад

    going from a 9900k to a 7800x3d to pair with my 7900xtx.

  • @8aliens
    @8aliens 10 месяцев назад

    I'm running an almost 10 year old CPU (4790k)... could I have your 2-gen-old (the horror!) CPU plz.
    The argument of "it's a compute shader, it's just maths, so it's the same" is a bit limited, for a couple of reasons.
    Different GPUs are more performant in different stages of the render pipeline depending on their architecture. Take tessellation for example, which uses the Hull-Shader Stage, Tessellator Stage & Domain-Shader Stage. Some GPUs are just better at tessellation than others, especially in the early days of DX11. It's all down to how the chip's architecture handles the staging and instructions it's given.
    The second point: you can do some very funky optimization in HLSL, especially with its multithreading. (You can do 3D arrays of threads, which can even share data! (Which FSR does use.)) Thus if you know exactly which one GPU your application is going to be running on, you can do some optimization matching the threads to the GPU architecture, but that's not really possible in the PC space; you're more just relying on how that particular GPU architecture handles a mismatched thread to GPU core count.
    As to whether these factors are an advantage/disadvantage for AMD vs nVidia vs Intel I can't say. It would be hard to measure exactly, and the results could vary across GPU architecture generations, and even by which GPU (4070 vs 4070 Ti) in said generation is being used.
    I was just pointing it out because not all 1s & 0s / "just maths" are equal.
    (A cache miss can be caused by "just maths" and they can be a real hit to performance. Which is the very reason AMD X3D CPUs get a performance boost in some situations vs other CPUs.)

  • @lgolem09l
    @lgolem09l 10 месяцев назад +1

    Well, the reason people might not notice shader stutter may be because they start looking for it after most of the shaders have been compiled. It's not gonna appear more than once in a game until you reinstall the gpu driver

  • @Stevecad_
    @Stevecad_ 10 месяцев назад +1

    7800x3d worth it for 4k over the 7600x?

    • @stevesonger1
      @stevesonger1 10 месяцев назад +2

      Not so much for 4K, but for the longevity of your system; the extra cores and the 3D V-Cache are going to last much longer

  • @mick7727
    @mick7727 10 месяцев назад

    Games need to use DirectStorage. It's there... why not preload shaders and textures?

  • @gordonjeffrey231076
    @gordonjeffrey231076 10 месяцев назад +5

    Couldn't quite afford the Ryzen 5800X3D but got the 5800X for my boy's Aorus B450 motherboard. I'm sure he'll notice the difference from his 2400G.

  • @doc7000
    @doc7000 10 месяцев назад

    In BG3, when the frame rates are lower you are likely hitting a GPU bottleneck, which is why both perform the same seeing as they both have RTX 4090 cards; when the GPU load is reduced, frame rates climb and you start to run into a CPU bottleneck.

  • @Machistmo
    @Machistmo 9 месяцев назад

    @8:38 you'd think that you would see higher numbers on the left (night) side.

  • @rluker5344
    @rluker5344 10 месяцев назад

    So in CPU-heavy areas, like crowded areas in Baldur's Gate 3 or CP 2077, a 7800X3D falls to about even with a stock 12900k with XMP ram.

  • @mrelba9176
    @mrelba9176 10 месяцев назад

    I'm surprised more stuff isn't sent to you guys actually. Do you refuse it to remain objective in your reviews and analysis? Or are they just scared? You're never brutally critical, just objective. You've sold me on multiple GPUs over the years. A little "dated" now, but I have a 5900x/3080 10GB system that still does me perfectly well. You guys gave me much better info than the billions of "benchmark" videos on here.
    Manufacturers need to respect your value to the consumer space more.

  • @RFC3514
    @RFC3514 9 месяцев назад

    3DNow! should be renamed 3DThen.

  • @baka_ja_nai
    @baka_ja_nai 10 месяцев назад +1

    "It's not a generation level difference"(c) - were you living in parallel universe for past decade?
    You would need SEVERAL generations of Intel processors to get 15% performance increase in games.
    But you make such a stupid claim right after new generation of Intel CPUs are released...
    And if you consider performance-per-watt, 7800X3D is multiple generations ahead of 12900k (by Intel's standards).

    • @evilleader1991
      @evilleader1991 9 месяцев назад

      these guys are Intel and Nvidia shills through and through