Ryzen 7 7800X3D Upgrade! The Fastest Gaming CPU vs

  • Published: 19 Nov 2023
  • ► Watch the FULL video here: • DF Direct Weekly #137:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry YouTube: / digitalfoundry
    ► Digital Foundry at Eurogamer: eurogamer.net/digitalfoundry
    ► Follow on Twitter: / digitalfoundry

Comments • 628

  • @MarikHavair 7 months ago +65

    Yep, got a 1700 on an X370 back in 2017, running a 5800X3D off of that now, what an upgrade.

    • @timothyandrewnielsen 7 months ago +15

      Best upgrade ever, the 5800X3D

    • @MarikHavair 7 months ago +11

      @@timothyandrewnielsen It certainly is the best in-socket upgrade I've seen personally.
      AM4 is probably going to be a bit of a legend in future generations. We'll all be like 'back in my day'.

    • @ekhyddan 7 months ago +6

      @@MarikHavair I hope that AM5 will have the same upgradability.

    • @dutchdykefinger 7 months ago +1

      i started on a r5-1600 on a b350 in may 2017 when it was pretty fresh
      i slotted a 3700x in on the b350 last year or something :D
      not quite as big of a jump as you got, but 28% better single threading still ain't nothing to sneeze at. doubling up that l3 cache and the separated uncore makes it far less memory clock sensitive too, so i'm riding this AM4 bitch until the wheels fall off

    • @timothyandrewnielsen 7 months ago +2

      @@dutchdykefinger started on x370 with 1800x. Same. 5800X3D until I see CPUs come out with comparable power efficiency and double the speed.

  • @i11usiveman97 7 months ago +141

    I have the 7800x3d with a 7900xtx and Lords of the Fallen still stutters when moving between areas, even though it's had many patches to date. There's nothing a CPU can do about a poorly optimised game engine.

    • @solo6965 7 months ago +4

      Nothing is perfect

    • @BornJordan 7 months ago +28

      Yeah this game is just poorly optimized, it has nothing to do with CPU

    • @DeepStone-6 7 months ago +10

      At that point blame the devs; that's pretty much an apex PC configuration.

    • @S4_QUATTROPOWER 7 months ago +1

      Getting the 7950X3d instead

    • @100500daniel 7 months ago +14

      Dev response: Obviously it's because you have a plebeian 7900 XTX instead of an RTX 4090. Quit being cheap 🫰

  • @SogenOkami 7 months ago +47

    Been using this CPU for the past month on my flight sims and good lord it destroys everything I throw at it!

    • @Teramos 7 months ago +4

      so does my 14700k@350watt :D damn, the performance/wattage is through the roof on the 7800x3d

    • @KiSs0fd3aTh 7 months ago +1

      @@Teramos When your 14700k pulls 350 watts it's literally twice as fast as the 7800x3d. In fact even at 200w it's twice as fast as the 7800x3d.

    • @coffee7180 3 months ago +4

      @@KiSs0fd3aTh wtf are you talking about, i don't get it.

    • @H786... 2 months ago +3

      @@Teramos that is cap, the 7800x3d is easy to run on an air cooler lol.

    • @a-qy4cq 2 months ago +1

      @@Teramos Your brain performance is defo not through the roof

  • @roki977 7 months ago +132

    what that cpu does with like 50w is the most impressive thing for me.. 5800x3d/7800xt here, great for my 240hz screen..

    • @abdirahmanmusa2747 7 months ago +1

      What are your frames looking like?

    • @Vinterloft 7 months ago +6

      I waited and waited with a 3570K for the first Ryzen mATX X-chipset motherboard. That's the one thing I will never compromise on. It took until X570 to arrive (thanks ASRock!) and by that time I already could go for 5800X3D as my first upgrade since Ivy Bridge. 11 years, how's that for an update cycle!?

    • @roki977 7 months ago

      @@abdirahmanmusa2747 i can get it over 200 in warzone, close to 300 in the Finals beta, i am waiting for that one to come out... nothing fancy, just cheap 3600 cl16 sticks and a b550 mb...

    • @roki977 7 months ago +1

      @@Vinterloft i am not far from that, a 3770k and gtx 580 Lightning Xtreme was my last serious rig until i bought am4 about a year and a half ago, but i had a 5600 and 5700x in there before x3d

    • @mangatom192 7 months ago

      @@Vinterloft I got an i3 2120 before I upgraded to an am5 platform (7600). I can DEFINITELY feel the difference.

  • @pottingsoil723 7 months ago +11

    Would be really great to hear you guys talk about cache-coherent interconnects like CXL providing high-bandwidth PCIe at extremely low latency, and if/when we might see this tech migrate from servers to mainstream, and the effect it could have on gaming and computing.
    A few years ago we heard all about the PS5's I/O accelerator and how the Oodle Kraken & texture decompression was a game-changer, but we've yet to see anything like it on PC (Windows' crappy implementation of DirectStorage does not count, in my opinion).
    What are your thoughts on this?

  • @mattx4253 7 months ago +20

    i got a 7800x3d and 4090 and it's solid as a rock so far. Crazy cool as well, while my old intel and 3080 was a room heater....

    • @KiSs0fd3aTh 7 months ago +12

      That's the kind of nonsense that is spread around. Your 3080 pulls around 320w, probably three to four times more power than your Intel did (assuming you played 1440p or 4k), your current build probably pulls even MORE power than your old one, but somehow your old one was a space heater. Right, sure buddy. What fucking ever

    • @mattx4253 7 months ago +1

      @@KiSs0fd3aTh I can't explain it other than the 10-core intel always needed max power. The 3080 Aorus Xtreme was factory OC'd to the max, and I'm now using 2 dimms of ram not 4, but my room is cooler. The system doesn't even try hard in games most of the time. It's just more efficient at hitting 144hz, so your thinking is flawed as you are only concerned with max tdp

    • @KiSs0fd3aTh 7 months ago

      @@mattx4253 I have a 4090 myself dude, both my 12900k and my 14900k are below 80w (in fact the 12900k hovers around 50-60) playing at 4k. Does your x3d pull 20w or what?

    • @mattx4253 7 months ago

      @@KiSs0fd3aTh I had a 10850k clocked 5ghz all-core. Pulled mega wattage. My 3080 was the extreme model. I'm also not at 4k, i'm at 1600p widescreen 38". This 7800X3d and 4090 Trio X combined just max out my 144hz in the sweet spot of the power curve, which is efficient. My 10850k and 3080 were always maxed out and couldn't break 100fps in most games even turned down.

    • @KiSs0fd3aTh 7 months ago

      @@mattx4253 So you are comparing a stock CPU with an overclocked one, right. Are you autistic by any chance? And of course your old build could break 100 fps, you had a 3080, lol.

  • @ralphmiranda2077 7 months ago +3

    I've been rocking this chip since earlier this year… I'm lovin' it!

    • @Makavelli2127 7 months ago

      I just bought this cpu for my first pc build, can't wait!

  • @BLUEKNlFE 6 months ago

    what are RPCS3 and the windows scheduler? how do you enable it?

  • @stanisawkowalski7440 7 months ago +18

    Guys, you don't know how much I miss your CPU reviews! Direct comparisons with framerate and frametime graphs show much more than the typical AVG+LOWs graphs we see everywhere.

  • @jakub_zawadzki 6 months ago

    If I do some productive tasks, mainly video editing, but also play a lot of games, would it be better to get a 7900 or a 7800x3d?

  • @chrisbullock6477 7 months ago

    Would like to hear an update on this, Alex. How are things going?

  • @dutchdykefinger 7 months ago

    even with async shader compiling, if it's a lot of shaders to compile at once, the API (read: overhead for every compile job sent) just kind of gets in the way i guess
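The per-job overhead this comment describes is usually amortized by batching many small compile jobs into fewer submissions. A minimal sketch of the idea in Python, with a hypothetical `compile_shader` standing in for a real driver call (all names here are invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def compile_shader(source: str) -> str:
    # Stand-in for a real driver call; each submission carries fixed API overhead.
    return f"binary({source})"

def compile_batched(sources, batch_size=32, workers=4):
    """Submit shaders in batches so the per-submission overhead is paid
    once per batch instead of once per shader."""
    batches = [sources[i:i + batch_size] for i in range(0, len(sources), batch_size)]
    results = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves input order, so results line up with sources.
        for batch_result in pool.map(lambda b: [compile_shader(s) for s in b], batches):
            results.extend(batch_result)
    return results
```

With 1000 shaders and a batch size of 32, the fixed submission cost is paid roughly 32 times instead of 1000.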

  • @ROAMZ101 2 months ago

    Any solid b650 motherboards that don't have weird issues like usb disconnects or bad driver issues? It's the only thing holding me back from switching to team red.

  • @larrytaylor2692 7 months ago +11

    Doesn’t matter how good your cpu is when these devs refuse to optimize

    • @stealthhunter6998 7 months ago +7

      Or in the case of unreal engine, the game engine being the problem and devs choosing it.

    • @Shieftain 7 months ago +3

      @@stealthhunter6998 Allegedly UE5.4 will bring some pretty big multi-threading optimizations. But given how game development has become this generation, we probably won't see these benefits until 2025 sadly.

    • @portman8909 7 months ago

      @@Shieftain Unreal Engine 5 games have all had stupid system requirements.

    • @Shieftain 7 months ago

      @@portman8909 Indeed, even when software Lumen is in use it's pretty steep. Disabling it boosts FPS a decent amount, but still the requirements are high. IMO games that have a large amount of static lighting are better off developed in UE4 as the older engine is in a good place now (mostly). Take Ghostrunner 2 and Lies of P for example. Both look visually solid and run extremely well.

    • @divertiti 2 months ago

      Nothing to do with optimization; modern games with modern visuals need more than a 6 yr old potato to run on

  • @dotxyn 7 months ago +69

    Enabling EXPO at 6400 may have halved the unified memory controller clock, reducing performance. Check ZenTimings and make sure the UCLK matches the MCLK. If not, you need to adjust the UCLK setting to UCLK=MCLK (not /2). Hopefully your X3D will be able to run 6400 UCLK 1:1; otherwise you may need to drop down to 6200 or 6000.

    • @the.wanginator 7 months ago +17

      You beat me to it. Even AMD states that 6000 CL30 is the sweet spot. Also, make sure the chipset driver is installed.

    • @xouri8009 7 months ago

      Isn’t he using DDR4? Or did I misunderstand?

    • @erickelly4107 7 months ago +2

      @@the.wanginator
      This doesn't mean that 6200+ MT/s is somehow "worse"; higher will be better as long as you can run it.

    • @mugabugaYT 7 months ago +7

      @@xouri8009 7800X3D only supports DDR5

    • @arc00ta 7 months ago

      Also, you want to make sure your board didn't ramp up the VDDP with the profile. Asus and MSI are pretty bad with this; that's the DDR5 I/O, and anything over 1.000V (the default AMD setting is 800mV) can make it super unstable without crashing.
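The 1:1 sanity check this thread describes can be sketched numerically. A hypothetical helper (not a real ZenTimings API; DDR5 MCLK in MHz is half the MT/s rate, so DDR5-6400 runs MCLK = 3200 MHz):

```python
def uclk_mode(mclk_mhz: float, uclk_mhz: float) -> str:
    """Classify the UCLK:MCLK relationship a tool like ZenTimings would show.
    1:1 is full performance; 1:2 means the memory controller silently
    dropped to half speed, adding latency."""
    ratio = mclk_mhz / uclk_mhz
    if abs(ratio - 1.0) < 0.01:
        return "1:1 (full performance)"
    if abs(ratio - 2.0) < 0.01:
        return "1:2 (UCLK halved, added latency)"
    return f"unusual ratio {ratio:.2f}"

# DDR5-6400 where the UCLK silently fell back to half speed:
print(uclk_mode(3200, 1600))  # -> 1:2 (UCLK halved, added latency)
# DDR5-6000 running the recommended 1:1:
print(uclk_mode(3000, 3000))  # -> 1:1 (full performance)
```

This is why the thread suggests dropping from 6400 to 6200 or 6000 if your chip can't hold UCLK=MCLK at 3200 MHz.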

  • @clutch7366 7 months ago +1

    Disabling smart access memory helps sometimes, like it did for me in Warhammer Darktide.

  • @mrjiggs8 7 months ago +18

    I went with the 7800x3d a while ago.... what's impressed me most.... the reduction in my electric bill.... the performance per watt with it and my 7900xtx is amazing.

    • @TheJonazas 7 months ago +3

      If you wanted the ultimate reduction in power while maintaining high fps, then the 7800x3d with a 4090 would be much better than your 7900xtx. The 4090 is so efficient when undervolted.

    • @Madhawk1995 7 months ago +3

      @@TheJonazas As typical of amdTards, they complain about intel cpu power draw while their rdna3 gpu pulls that much in difference simply idling. 7800x3d + 4090 is the most power efficient setup.

    • @Jasontvnd9 7 months ago +12

      @@TheJonazas Yeah, takes a while to make up for the extra $1000 spent on the card in the first place though, doesn't it.

    • @BlacKi-nd4uy 7 months ago +1

      i changed from an undervolted 13600k with a 4800/3000mhz all-core setting and 3600 cl14 ram to a 7800x3d, CO -30, 5800 cl28 setup. around +50% performance at -50% power consumption. the power consumption was the biggest hit for me. to be fair, under an fps lock and gpu limit both have enough power and the power consumption is only a little lower on the 7800x3d. only in cpu-limited scenarios is the power consumption difference big.

    • @pakjai5532 7 months ago +2

      @Madhawk1995, going the 4090 route is like paying your electricity bills in advance.😂

  • @t3chn0m0 7 months ago +10

    If you run the memory at 6400 MT/s, please make sure that it's running FCLK/MEM Div 1:1, otherwise it's 1:2 and you lose quite a bit of performance.

    • @JFinns 6 months ago +2

      Great tip, this is why even AMD recommends 6000 MT/s CL30 ram for these CPUs.

  • @SanderSander88 7 months ago +1

    Just bought my pc upgrade, it's coming today. Got it for 350€, so that's a good deal for where i live. So excited to get building.

  • @dante19890 7 months ago

    1:45. John's delayed laugh cracked me up XD

  • @nickvirgili2969 6 months ago

    Yes, display port is better that way for some reason? I haven't fully mapped it yet.

  • @gibbles3216 7 months ago +26

    The parallelization is the hard part. You almost have to make a script that will look for CPUs and create profiles based on the core and thread count. This will take time. This is where AI would be very helpful. It can be in the background optimizing threads for a 4, 6, 8, 12, and 16 core CPU. It would run a check to see the core count and execute the script to start the threading in the Windows scheduler or a pre-baked in-engine scheduler. This way tasks can be offloaded to various threads based on their criticality. For instance, a thread may just be used to compile shaders in the background constantly. Future engines will need to focus on multi-core threading in earnest. You can have the fastest IPC on the planet; it will mean nothing if the engine cannot properly use most, if not all, the cores and threads at its disposal. I have a 5950x. In newer games it is being used at 7-20% based on what is enabled. I know it is not the fastest CPU on the market in terms of IPC. But, if utilized properly, it is still very fast. TLDR: the CPUs are not really the problem unless they are budget-level CPUs. The utilization of them is, and will be till engines are centered around getting the most out of them.
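The detect-cores-then-pick-a-profile idea described above can be sketched in a few lines. The profile table, thread-role names, and budgets here are all invented for illustration; a real engine scheduler would be far more involved:

```python
import os

# Hypothetical thread budgets keyed by core count, in the spirit of the
# per-core-count profiles the comment imagines an engine might ship.
PROFILES = {
    4:  {"render": 1, "workers": 2,  "shader_compile": 1},
    6:  {"render": 1, "workers": 3,  "shader_compile": 2},
    8:  {"render": 2, "workers": 4,  "shader_compile": 2},
    12: {"render": 2, "workers": 7,  "shader_compile": 3},
    16: {"render": 2, "workers": 10, "shader_compile": 4},
}

def pick_profile(core_count: int) -> dict:
    """Pick the largest profile that fits the detected core count,
    falling back to the smallest profile on very small machines."""
    eligible = [c for c in PROFILES if c <= core_count]
    key = max(eligible) if eligible else min(PROFILES)
    return PROFILES[key]

# Detect the machine's logical CPU count and choose a budget.
profile = pick_profile(os.cpu_count() or 4)
```

A 10-core part would land on the 8-core profile, a 32-thread 5950X on the 16-core one; one of the budgeted roles is a background shader-compile thread, as the comment suggests.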

  • @SvDKILLSWITCH 7 months ago +2

    Definitely agree with Richard on the 5800X3D. I think it's worth keeping in mind that AMD tried several times to break forwards compatibility on AM4, but the whole platform is better off for their eventual concession and extension of AGESA support to 300-series boards. There are other limitations to consider when using such an old board (limited PCIe speeds and expansion being the main one), but I was able to drop a 5800X3D into the same board I purchased back in March 2017 with a Ryzen 7 1700 - nothing short of a gigantic increase in CPU and especially gaming performance. Newer CPUs and BIOS updates also dramatically increased RAM compatibility to the point where I've gone from having to downclock a two-stick RAM setup from 3200 to 2666 to get stability all the way to four sticks at a lazy 3600 CL16 with headroom for lower timings.

    • @PeterPauls 7 months ago +1

      I did a similar upgrade. I have an MSI B450 Tomahawk Max motherboard with great VRMs; it had a Ryzen 7 2700X in it and I upgraded to a Ryzen 7 5800X3D, which was a huge performance leap. My only problem is that there is only a single M.2 PCI-E 4x slot, so I use a 2TB SSD in it. I put an RTX 4080 in the CPU's only PCI-E x16 3.0 slot, but so far there's no PCI-E bandwidth bottleneck, and the CPU also doesn't bottleneck the GPU.

    • @Solrac-Siul 7 months ago +2

      it is a mixed bag in all honesty. While we do get a new sparky cpu - because i kind of did the same - we miss out on other amenities that a 2017 board like mine didn't have compared with a 2020 or 2021 board, which means we actually do not take full advantage of the cpu... we get close, maybe 90 to 95% of the real potential. Eventually, for professional reasons, in early 2022 (February) I actually needed more m.2 4x slots, and because of that, in that specific timeframe, I had to go the intel way, since the z690s had been out and had 4x m.2 4x slots - and AMD's upcoming platform was by then still multiple months away. So sometimes we value cpu socket upgradeability too much: while it is nice some 4 years down the road to have an upgrade, in 4 years multiple things moved on and we end up lagging on multiple aspects, and that, depending on each one's personal needs, can vary from a minor inconvenience to actual time loss, which is bad when time=money.

  • @bmw523igr 7 months ago +3

    Great timing. Yesterday I changed my platform from an AM4 5600 to an AM5 7800X3D with a 7800XT. The setup is not complete (missing one more M.2 drive), but I can't wait to use it.

    • @ZackSNetwork 7 months ago

      You are GPU bound, however. Nice CPU.

  • @prototype8137 7 months ago +3

    Jeez. The star citizen footage is like night and day. 😅

  • @kravenfoxbodies2479 7 months ago

    I bought a combo deal back in July of this year at Best Buy for a Ryzen 5 7600X / Gigabyte B650 Aorus Elite AX / Starfield for $289.99 USD. I have yet to get the board to update the BIOS; Q-Flash always says bad image.
    All my MSI boards update without issues from the same USB 2.0 flash drive; this Gigabyte board wants to be so different.

    • @nivea878 6 months ago

      Grab BIOS F2; if you use a USB stick, rename the file to gigabyte.bin (upper or lower case letters don't matter)

  • @jergernice1 7 months ago +2

    just built my 7700x, seems to work great. have no idea what scheduling he's talking about adjusting?

    • @gellideggti 7 months ago

      It will not be a problem for the chip you have, mate. It's more for the 16-core dual-CCD part, one CCD with 3D V-Cache and one without. They use the Microsoft Game Bar stuff to sort out which half of the chip to use: the 3D half or the standard half.

  • @icky_thump 7 months ago +41

    Just upgraded to this when MicroCenter put it on an early Black Friday sale for only $299! It quickly went back up to $369 like a day later 😅. So far so good. My games' 1% lows seem a lot stronger, the GPU stays near 99-100% more often, and strangely it runs cooler than my 7700X.

    • @phantom498 7 months ago

      What’s your gpu?

    • @DarthSabbath11 7 months ago

      Just bought mine for 398, ffs 😔

    • @ionnicoara6657 7 months ago

      You can ask for a price match. I asked Newegg; they didn't give me 299 but went down to 349.

    • @andrewmorris3479 7 months ago +7

      Firstly you scored on that ridiculous price! Secondly, the 7800X3D pulls about 30-40% less wattage in gaming and almost 50% less in multi core workloads.

    • @BiscuitMan2005 7 months ago +1

      Wow, and I thought 369 was a deal from them when I got it last week there. I have the worst luck

  • @RhythmicJAY 7 months ago

    no fresh OS install?

  • @45eno 5 months ago +1

    I went from a 5900x to a 5800x3D to a 5800x to a 7600x to a 7800x3D in the last 10 weeks. The vanilla 5000 series definitely didn't let my 6900xt breathe as well compared to both the X3D and the 7600x.
    Snagged smoking prices with very, very little out of pocket. The 7800X3D was a local buy but was sealed brand new for $340. Sold my 5800x3D for $280.

  • @dylanbeazley6739 7 months ago

    Hey guys! Was wondering if you think running a stream, vtubing and discord while gaming is possible on the 7800x3d without dropping 1% lows in valorant. Trying to do all that and have 360hz feel like 360hz while streaming haha

    • @arp1 7 months ago

      Yes.

    • @dylanbeazley6739 7 months ago

      @@arp1 Oh nice, you running the same chip? If so, what are your 1% lows looking like while streaming?

  • @AshtonCoolman 7 months ago +8

    I have a 7800X3D, RTX 4090, and CL30-36-36-76 DDR5-6000, all on an ASUS X670E Crosshair Hero, and my system is flying.

    • @baghlani92 7 months ago

      I have CL30-38-38-96; is it a good one?

    • @Strengthoftenmen 7 months ago

      I have the exact same setup: 7800x3D, 4090, x670e hero. But initially I was running Corsair Vengeance 5600 and the system kept crashing. Just purchased some G.Skill 6000 MT/s CL30, hoping it will improve things.

    • @AshtonCoolman 7 months ago

      @@Strengthoftenmen Have you updated your BIOS to the latest version? It's a necessity on these boards. The BIOS this board shipped with has unstable DDR5 support and can also kill your 7800X3D due to too much SoC voltage.

    • @Strengthoftenmen 7 months ago

      @@AshtonCoolman thanks for the reply. Yes I have updated the bios. I think the mb is just very sensitive to timings??

    • @FearlessP4P1 7 months ago

      @Strengthoftenmen how has the new ram been working? I've heard of a few issues with Corsair and another ram brand, but I've yet to hear any issues with G.Skill

  • @Train115 7 months ago

    I've been using one for a few months now, it's strong. :)

  • @razzorj9899 7 months ago +3

    I just upgraded from a Ryzen 5600 to the 7800x3d and wow, what an upgrade: more than doubled my fps in Warzone 2 and MW3. And I play at 3440x1440, so yes, the cpu matters at higher resolutions as well

    • @mttrashcan-bg1ro 6 months ago

      The CPU matters at 4k, especially with a 4090; my 4090 is pretty much never at 99% usage in any game at 4k Ultra, and RT makes matters worse because a 5900X just can't do it. But seeing the 7800X3D and 14900K in benchmarks, they only do like 30% better, which isn't worth spending another $2000 AUD just yet.

    • @xbox360J 5 months ago

      Same here!!! That cpu upgrade has been awesome along with my 3080.

  • @JoePolaris 7 months ago

    Have you guys looked at ram tuning, matching the Infinity Fabric speed?

  • @HallyVee 6 months ago

    Just did this upgrade myself, but can't get the am5 ram timings right, can't get the nvme format right, even the gpu had a clearance issue. Think I'm done with building, it was fun way back when, when the most complicated part was mb compatibility etc. Now I gotta browse 14 different forums...

    • @nivea878 6 months ago

      It's the AM5 platform, hit or miss. Intel is superior; i have a 7800x3d and a 14700K, and with AMD every time something meh happens

  • @rluker5344 7 months ago

    So in heavy CPU-use areas, like crowded areas in Baldur's Gate 3 or CP2077, a 7800X3D falls to about even with a stock 12900k with XMP ram.

  • @supraken123 7 months ago

    Go on lads.

  • @12me91 7 months ago +17

    While I would have liked to have seen the 5800x3d used in its day, I'm loving the fact that you'll finally be using a 3d chip

    • @selohcin 7 months ago +1

      Took the McDonalds commercials a little too seriously, did ya? 😂

    • @SpecialEllio 7 months ago +1

      @@selohcin loving something isn't a commercial exclusive, and he didn't even say the mcdonalds quote

    • @puregarbage2329 7 months ago +4

      Hardware Unboxed used it in almost all of their benchmarks when it was still relatively new. It's conclusively better than the 10900k, and pretty close to the 12900k, sometimes slightly better. The 7800x3d is easily the best gaming cpu when you consider the price.

  • @guntherultraboltnovacrunch5248 6 months ago +1

    As a guy who lives in the Arizona desert, I'm just happy with how it sips power and runs cool.

    • @M41NFR4M3Gaming 5 months ago

      I live in Arizona too. I’ve had both. My 14900k runs cooler than the 7800x3d. Returned the 7800.

  • @ryanocallaghan8833 6 months ago

    Does Star Citizen still have issues with E-cores? Could explain the large performance delta and stuttering on the 12900K.

    • @daweitao2668 6 months ago

      The E-cores issue is fixed under W11 (but not W10). SC is just not optimised yet, and the extra cache really helps in titles where memory usage isn't tuned so well.

    • @ryanocallaghan8833 6 months ago

      @@daweitao2668 Thanks. My biggest concern with the 3D V-Cache CPUs is consistency - there is a somewhat larger gap between the highs and lows compared to other CPUs. Factorio is a good example, with map sizes that "fit?" in the cache seeing large frame rate boosts, while increasing the map size to make it no longer fit in the cache and rely more on memory access makes Intel CPUs take the lead again. I've got a 13900K in my gaming machine and I find it to be an extremely reliable performer in all CPU intensive gaming scenarios.

  • @andrewg2536 6 months ago

    anyone know if I would see much improvement going from an RTX 4080 with a 5900X to a 7800X3D for 4K gaming on an LG CX?

    • @mttrashcan-bg1ro 6 months ago +1

      Most likely a noticeable difference, but the best way to know is by monitoring your GPU usage with the framerate uncapped. I know a 5900X isn't enough for a 4090 at 4k; the 4080 is going to be far less bottlenecked, but I'd imagine to some extent it still is, and the 4070 is probably the fastest card I'd pair with a 5900X
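The "monitor GPU usage with the framerate uncapped" heuristic can be sketched as a simple check over utilization samples. The thresholds and sample numbers below are made up for illustration; on NVIDIA cards the samples could come from `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits -l 1`:

```python
def looks_gpu_bound(gpu_util_samples, threshold=95.0, fraction=0.9):
    """Heuristic: with the framerate uncapped, a GPU pegged near 100%
    most of the time suggests a GPU bottleneck; sustained lower
    utilization points at the CPU holding the GPU back."""
    busy = sum(1 for u in gpu_util_samples if u >= threshold)
    return busy / len(gpu_util_samples) >= fraction

# Made-up sample logs for illustration:
print(looks_gpu_bound([99, 98, 100, 97, 99, 99]))  # True  -> GPU-bound
print(looks_gpu_bound([70, 65, 80, 72, 68, 75]))   # False -> likely CPU-bound
```

The second pattern (GPU hovering around 65-80%) is what the comment describes for a 5900X paired with a 4090 at 4K: the CPU can't feed the card fast enough.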

  • @bradcole2015 4 months ago

    Why would you use HDMI over DP? Also, use DLAA instead of DLSS

  • @Lionheart1188 7 months ago

    Nice, just bought myself one in the Black Friday sales. Already noticing the min & max fps increase.

  • @that_colin_guy 7 months ago +2

    3:15
    Displayport Superiority 🤘🏻

  • @rob4222 7 months ago

    I want more CPU benchmarks. CPUs do matter too

  • @benknapp3787 5 months ago +1

    I have this CPU coming later this week along with a 7900XTX for my very first build. Should be fun!

    • @keynoteFlyer 17 days ago +1

      I've just ordered the same combo, how's it been?

    • @benknapp3787 17 days ago

      @@keynoteFlyer It's been absolutely amazing. I run a 65" LG C2 and two 24" Dell side monitors. I run every game I play at 4K Max settings and it crushes them.

    • @keynoteFlyer 17 days ago +1

      @@benknapp3787 thanks for the quick reply Ben. That sounds amazing. My setup is mainly for flight sim and a few Rockstar games. I’m looking forward to this massive upgrade from my old i7 with 6600XT. 😀👍🏻

    • @benknapp3787 17 days ago

      @@keynoteFlyer You won't be disappointed. What version of the 7900XTX did you order? I'm running the Merc 310. It hits about 450w with the power limit raised, but it's worth the extra performance in demanding games.

    • @keynoteFlyer 17 days ago +1

      @@benknapp3787 I've ordered the ASRock 7900 XTX Taichi edition. Both of my previous cards have been Sapphire Nitro+ versions, but I can't find any of those in the 7900XTX. The Taichi sounds like a decent equivalent. What power supply did you go for? The AMD website recommended 1000W but the builder wanted to give me 850W (based on the lower power draw of the 7800X3D, I think)

  • @HardwareAccent 7 months ago +4

    Is it possible that some traversal stutters can be affected by RAM and/or NVME speed?

    • @Chilledoutredhead 7 months ago +2

      It's a fair bet he has the games on a good nvme drive, and his RAM is a really good high-speed kit. So my educated guess would be no. Just games that are not designed for pc, because companies just want to optimise for console.

  • @nemz7505 6 months ago

    The 3DNow patch for the original Freespace was pretty impressive to be fair ;)

  • @elbowsout6301 7 months ago +1

    I just built a new rig 6 weeks ago with a 7800X3D and an RX 6950 XT and I couldn't be happier. It's been a few years since I have built a PC from scratch and the build process was seamless. Go treat yourself to some 6000 MT/s RAM.

    • @godblessusa1111 6 months ago

      weak card for this processor... you need a min rtx 4080

  • @Fefeallday. 7 months ago +2

    I have a question. I also have the 7800x3d, but my concern is the temperatures... what should the temperatures be like when playing? I have a 240mm nzxt kraken and I find them high when playing, at 70c to 75c. Help!

    • @f-35x-ii 7 months ago

      What will help is an undervolt. I did -30 on all cores on my 5800x3d, and wow, it keeps temps down and I even see better performance

    • @mugabugaYT 7 months ago +7

      70-75°C is perfectly fine. As long as you're below 95° you're good.

    • @kevinfromsales6842 7 months ago +1

      Mine gets kinda toasty too. Shoots right up to 82 when stress testing but stays around 65 to 70 when gaming. I don't think 75c will kill a 7800x3d but I'm not sure.

    • @morbid6944 7 months ago +4

      Those are quite fine temperatures while gaming. Don't worry!

    • @randomguydoes2901 7 months ago

      70-75 high? haha. curve optimize like everyone else or buy a console and stfu

  • @f-35x-ii 7 months ago +5

    I'm still going strong with my 5800x3d, and i even undervolted to -30 per core, and wow, temps are good and performance is still top notch! I'm not gonna upgrade the cpu for maybe 8 years, and I'll get whatever is top price-to-performance then. (Edit: from all-core to per-core -30 undervolt)

    • @devilmikey00 7 months ago +1

      AMD CPUs are great in that undervolting usually increases performance. I had a 5700x for a bit, and using PBO and an undervolt got me 250mhz more boost while running cooler. I know the X3D parts can't raise their boost, but I bet it's staying at its max boost longer and more consistently now that you've done that.

    • @f-35x-ii 7 months ago

      @@devilmikey00 correct, mine stays at max boost all the time, it doesn't even go down at all while gaming. it was locked in earlier BIOS versions, but with the newest BIOS it is easily changed!

    • @teddyholiday8038 7 months ago +2

      I'm running all-core -30 on my 5800x3D too.
      It's a beast CPU, but I don't think it's gonna last me 8 years. There are plenty of recent games just hammering CPUs.
      I'm skipping Zen4, but I'll consider whatever the Zen5 equivalent is (9800x3D?)

    • @exoticspeedefy7916
      @exoticspeedefy7916 7 месяцев назад +2

Per core is better, otherwise you could actually lose performance if you have too much of a negative offset

    • @f-35x-ii
      @f-35x-ii 7 месяцев назад +1

      @@exoticspeedefy7916 oh sorry , that is what I meant, per core, just double checked my BIOS

  • @fracturedlife1393
    @fracturedlife1393 7 месяцев назад

    ACC massively prefers the 3d vcache too. Many games have crazy uplift, and almost as many others would prefer non X3D.

  • @miikasuominen3845
    @miikasuominen3845 7 месяцев назад

    I'll most probably get one now, from BF-sales... Most probably with ASRock X670E Steel Legend and Corsair 6000/cl30 RGB memory.

  • @georgeindestructible
    @georgeindestructible 7 месяцев назад

    11:57 - 12:57 yet that's literally a CPU killer right there, my 5800x was struggling so hard there.

  • @jadedrivers6794
    @jadedrivers6794 7 месяцев назад +2

Being able to run a 5800X3D on a motherboard from 2017 is amazing! Getting a 35% increase in fps going from an R5 3600 to a 5800X3D at 1440p (source: HUB) is mind boggling

    • @markotto1767
      @markotto1767 7 месяцев назад

With the same GPU? What GPU do you have?

    • @jadedrivers6794
      @jadedrivers6794 7 месяцев назад +1

      @@markotto1767 I have a 7900 XT. HUB tested with a 6950 XT. There is around a 15% performance gap between the two cards but the results should be very similar.

  • @vulcan4d
    @vulcan4d 7 месяцев назад +3

I hope you guys do memory tuning and compare. It is amazing how close the 7700X is to the 7800X3D when you have tight memory timings. The secret sauce is that the 7700X has better latency up to 16MB of cache; above that the 7800X3D wins, but with tight memory timings it isn't that much of a difference any more.

    • @JsGarage
      @JsGarage 7 месяцев назад +2

      But what’s the point? Pay a little more get an overall faster CPU for gaming and don’t have to mess with memory tuning or risking corruption when you don’t get your timings just right. Guess if you love tweaking get a 7700X and feel special that you got there differently. For most people it’s a waste of time for what you’d save tweaking memory on 7700X instead of 3D route.

  • @mick7727
    @mick7727 7 месяцев назад

    Games need to use direct storage. It's there... why not preload shaders and textures?

  • @marcotomiri3440
    @marcotomiri3440 7 месяцев назад +1

I need the 7950X3D for Cities: Skylines 2; the 7800 has too few cores for that game in the late game, with more than 500k people

  • @J_Brian
    @J_Brian 6 месяцев назад

    What helped me is upgrading my RGBs to OLEDs

  • @spiritualantiseptic
    @spiritualantiseptic 7 месяцев назад +43

    I jumped from R5 7600 to 7800X3D solely because DF made me conscious of stutters.

    • @BigGreyDonut
      @BigGreyDonut 7 месяцев назад +5

      I made that exact same jump bc I had a “It’s time” moment once it went on sale again

    • @spiritualantiseptic
      @spiritualantiseptic 7 месяцев назад +10

      @@BigGreyDonut To be fair though the performance jump isn't actually worth the price because R5 7600 is a great value but man, the DF is helping hardware producers 😂

    • @BigGreyDonut
      @BigGreyDonut 7 месяцев назад

      @@spiritualantiseptic hey now…don’t actually use reason🥲

    • @DeepStone-6
      @DeepStone-6 7 месяцев назад +10

I got the 5800X3D because it ran Star Citizen the best out of everything out at the time, but then I realized every other game I play ran much smoother too, so now if it doesn't have 3D at the end I don't want it.

    • @nightspiderx1916
      @nightspiderx1916 7 месяцев назад

@@DeepStone-6 I had the 5800X3D and also the 7800X3D, but I might stay with the Zen 5 without V-Cache because the Zen architecture likes tuned DRAM.
In Star Citizen a tuned Zen 4 is almost as fast as a Zen 4 with V-Cache, because some games want the +600 MHz, and for some games the additional 64MB of L3 is not enough.
There are testers out there on RUclips who compared the latest Intel and AMD CPUs with tuned DRAM, and with tuned DRAM Intel is often ahead, like in Star Citizen.
So I'm thinking about buying the "vanilla" Zen 5 and tuning it as well as I can with DRAM 6400 and beyond (subtimings).
      ruclips.net/video/HzPrGv22R7g/видео.html

  • @FooMaster15
    @FooMaster15 7 месяцев назад +2

    Some problems you just can't brute force with sheer CPU power... I hope UE5 gets more CPU optimized soon.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 6 месяцев назад

      UE5 doesn't seem to have CPU problems as much, it scales pretty well with core/thread count and speed/IPC, the UE5 games are just really GPU heavy. A lot of these newer games with CPU problems such as Jedi Survivor are still using UE4, the ones on UE5 that are CPU dependent probably just want faster chips

  • @kylecorona8176
    @kylecorona8176 2 месяца назад

How is a 4090 getting 80 frames in Jedi Survivor at 1080p with DLSS? That doesn't make sense at all

  • @fuckyoutubrforchangingthis
    @fuckyoutubrforchangingthis 6 месяцев назад

Is my 7800X3D + 7900 XTX Speedster Black + 990 Pro SSD + Gigabyte B650 Aorus Elite AX + G.Skill Flare 5 DDR5 6000MHz RAM a viable setup? I'm in my return period... should I get a 4080? Or switch my CPU or something? It games very well atm, but... yeah, please respond

  • @elvertmack5039
    @elvertmack5039 7 месяцев назад +3

I went AMD... but only halfway. Got a 12900K combo deal with CPU, motherboard, and DDR5 6000MHz RAM for like 400 bucks for the whole combo and paired it with a 7900 XTX, and I tell you... this combo is butter smooth. I also have a 13700K/4080 and I must say that the 7900 XTX system feels smoother than my 4080 system. Both are on a 34 inch Alienware OLED monitor at 1440p and getting beautiful results.

    • @VYM
      @VYM 7 месяцев назад

      Worst combo ever. Extremely power hungry CPU and same GPU together is a big fail. You should've bought 7800x3d and RTX4070 with 550 watts PSU.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 6 месяцев назад

      @@VYM Most people don't care about that. I'd personally rather AMD CPUs in my system with a 4090 so I don't risk saturating my 1000w PSU, but if I did go Intel, I wouldn't care that much about the extra 100w. Most people just wanna run games and get good performance with manageable temps. Despite my 5900X only getting a max of 145w under full load, it's so ridiculously hot anyway

    • @VYM
      @VYM 6 месяцев назад

@@mttrashcan-bg1ro I cut down my 14700K turbo boost power from 253/135 to 110/110 and the all-core boost ratios from 55/56 to 42. Lost 14% performance, but temperatures now stay under 65-70 degrees, with Cinebench dropping from 30500 to 26000.

  • @dex4sure361
    @dex4sure361 6 месяцев назад

    6:08 I also use this scene to test CPU single thread performance in games. Crysis engine is just very single thread heavy and that scene just always maxes out single thread and creates huge GPU bottleneck.

  • @CruitNOR
    @CruitNOR 7 месяцев назад

    8:26 Mate the difference is literally night and day!

  • @Raptor999_
    @Raptor999_ 7 месяцев назад

I quit playing Apex Legends because, even after optimizing everything, it just stopped working without constant lag and stutters. My old 1080 Ti is struggling out here

  • @lotrcod4life
    @lotrcod4life 7 месяцев назад

    I'm planning on upgrading from my 12700k with 32 gb ddr4 3600 to a 7800x3d ddr5 6000 cl30. Not sure what to expect but this makes me excited for the boost that I should get. Anyone have any idea how big a boost I could see?

    • @mitchmurray2260
      @mitchmurray2260 6 месяцев назад

Boost for games? If you're at 1080p, you might see a 10 to 20% jump in some games. But if you also do production work and not just games, a 14700K would be a better upgrade, as the 7800X3D is really just a CPU that runs the fps up at 1080p but performs awfully compared to the newer Intels in production work.

    • @lotrcod4life
      @lotrcod4life 6 месяцев назад

      yea just gaming at 3440x1440@@mitchmurray2260

    • @daweitao2668
      @daweitao2668 6 месяцев назад

      No need to upgrade your current CPU tbh, it's still strong.

  • @nickvirgili2969
    @nickvirgili2969 6 месяцев назад

It's awesome. I just gotta upgrade, kinda, from my 6950 XT to 7th gen, though I'm hoping we get some love from the company, cause it's still a beast at 1440p and 4K so..... Wish Nvidia weren't greedy mfers.

  • @ivantsipr
    @ivantsipr 7 месяцев назад

    Oh no, not the Aorus X670E Master 😮
    😅ok I reach the point where you said you updated the bios, ok 👍

  • @yousuff1
    @yousuff1 7 месяцев назад +1

    With curve optimizer you can make the 5800X3D/7800X3D incredibly efficient.
    -30 all cores for me, rarely breaks 50c in games.

    • @exoticspeedefy7916
      @exoticspeedefy7916 7 месяцев назад +1

Better to test per core. You can lose performance if you have too much negative offset

    • @yousuff1
      @yousuff1 7 месяцев назад

No loss in performance or stability; I ran the necessary tests for longer than suggested. The 5800X3D has been binned quite well, loads of people manage -30 all core. @@exoticspeedefy7916

  • @Kapono5150
    @Kapono5150 7 месяцев назад +2

    Just picked up the 7800X3D. Shout out to Intels “Core Truths”

  • @doc7000
    @doc7000 7 месяцев назад

In BG3, when the frame rates are lower you are likely hitting a GPU bottleneck, which is why both are performing the same, seeing as they both have RTX 4090 cards. When the GPU load is reduced, frame rates climb and you start to run into a CPU bottleneck.

  • @johnhughes9766
    @johnhughes9766 7 месяцев назад

The limiting thing you speak of is RAM latency

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 7 месяцев назад

I watched some benchmark videos from a youtube channel called "Terra Nigma", and at 4K max settings with a 4090 the difference in fps between a Ryzen 5900 and a 7800xt is within the margin of error. Most AAA games today are still GPU bound at 4K even with a 4090. I was surprised.

    • @NulJern
      @NulJern 7 месяцев назад

      it all depends on what resolution you play at. if you lower your resolution you will become CPU bound at some point and not many play at 4k

    • @lgolem09l
      @lgolem09l 7 месяцев назад +1

      4k is a ridiculous resolution that was never viable for actually top of the line games without dlss.

    • @NulJern
      @NulJern 7 месяцев назад

      @@lgolem09l do you play at 4k?

    • @lgolem09l
      @lgolem09l 7 месяцев назад +1

      @@NulJern yes, previously with a 3080 and then changed to a 4090. I mean you can see in the benchmarks that modern games are often even cpu bound if you want to reach 120fps.

    • @NulJern
      @NulJern 7 месяцев назад +1

      @@lgolem09l well what kind of cpu did your pair with your gpu if you are cpu bound at 4k? You should be gpu bound at 4k, not cpu bound. I get 144+ in most games in 4k, and i'm always gpu bound, even tho i only have 8 core cpu with stock cooler.

  • @user-cf4us2lz5s
    @user-cf4us2lz5s 7 месяцев назад

    It is really a generational difference and you've shown that.

  • @Typhon888
    @Typhon888 7 месяцев назад +2

Every real hardware enthusiast already knows X3D is just hype. It's good at 10 games with a $2,500 GPU… then look at the multithreaded performance, which is worse than my 11900k from 2019.

    • @mitchmurray2260
      @mitchmurray2260 6 месяцев назад

exactly... it's strictly a 1080p chip to run up the fps

    • @evilleader1991
      @evilleader1991 6 месяцев назад

      lol what?? x3d really improves the 1% and 0.1% lows lmao@@mitchmurray2260

  • @kybercrow
    @kybercrow 5 месяцев назад

Yeaaaaah, the point they made in the first 5 minutes was very apt. People are very quick to blame others' hardware, and for some reason hardly ever want to attribute major issues to game optimization. Yes, a better CPU will give better performance, as this video shows, but people are very hyperbolic on this point; big issues are likely to be the game.
    I have the 7700X. Avatar runs like butter on ultra settings with the occasional hitch here & there, but I get frequent stutter in CP 2077 after patch 2.1.

  • @fracturedlife1393
    @fracturedlife1393 7 месяцев назад

    Damn, the Master???? 🧐

  • @Spartan_83
    @Spartan_83 7 месяцев назад +1

    I have a 7950X. should I stay with that CPU or would it be better to "upgrade" to the 7800x3d or 7950x3d or just wait for the newer cpus?

    • @drsm7947
      @drsm7947 7 месяцев назад

      just wait for next gen cpu

  • @Nintenboy01
    @Nintenboy01 7 месяцев назад +2

    Based on leaks/rumors Zen 5 may not be a massive leap in terms of IPC - Zen 2 to 3 was apparently a bigger jump. If you're already on Zen 4 it seems more prudent to wait for Zen 6 or even 7 or see how Intel's 15th or 16th Gen do

    • @_N4VE_
      @_N4VE_ 7 месяцев назад

      I have a 79xtx and been debating on upgrading to the 7800x3d from my 5800x3d. After many videos it seems it’ll get me roughly 15-20fps gain in general, some games don’t show much improvement at all. After pricing everything I need, i’ll be spending $725.00. Is that even worth it? Or should i just stick to my 58x3d until next gen?

    • @Nintenboy01
      @Nintenboy01 7 месяцев назад +1

      @@_N4VE_ probably best to skip at least 1 Gen. Maybe wait for the 8800X3D

    • @_N4VE_
      @_N4VE_ 7 месяцев назад +1

      @@Nintenboy01 Thanks, makes sense. Id rather spend that kinda money and get a good jump in performance rather then a minor one. Guess i’ll just have to live with it for now, 58x3d still a great chip and is doing well with the 79xtx.

    • @Nintenboy01
      @Nintenboy01 7 месяцев назад +1

      @@_N4VE_ yeah, in many games it's also almost as good or even slightly better than the 12900K

    • @_N4VE_
      @_N4VE_ 7 месяцев назад +1

      @@Nintenboy01 glad i looked into it more and got second opinions. I almost impulse bought an upgrade that I don’t really need lol. The extra frames would be nice but i don’t see it justifying dropping $700< for it

  • @o0Silverwolf0o
    @o0Silverwolf0o 7 месяцев назад +1

I have that board for my 7800X3D/4090 build and it's been great. I did a lot of research before choosing a board and heard that the PCIe architecture on this board was a bit different than most, allowing better performance, and checked out the diagrams that showed why.

    • @Curious_BeingNX
      @Curious_BeingNX 7 месяцев назад

      which one? didn't get it

    • @o0Silverwolf0o
      @o0Silverwolf0o 7 месяцев назад +1

      @@Curious_BeingNX the Gigabyte Aorus X670E Master.

  • @nickvirgili2969
    @nickvirgili2969 6 месяцев назад

    Gotta have that 13 extra fps!!😂😂 i just wish i could afford the best cause i love pushing the limits of tech. Cpus for sure from sbcs to full on pc fire. Laptops with external gpus custom made to run harder, i love getting back into this ish. Gaming as a mark comes down to software too these days.

  • @Cernunn0s90
    @Cernunn0s90 7 месяцев назад +11

I can't stand stutters, and in recent years the shader compilation stuttering in so many games is honestly driving me nuts. Something has to be done.

    • @stealthhunter6998
      @stealthhunter6998 7 месяцев назад +7

It’s not just shader stutter, it’s traversal stutters, which are even worse. Straight up stutters all the time, not just one time. Like Dead Space remake. 100% the worst part of PC gaming right now.

    • @liftedcj7on44s
      @liftedcj7on44s 7 месяцев назад +1

      It has made PC gaming no fun for myself, the way these developers are coding these games is just stutterfest everywhere.

    • @michaelzomsuv3631
      @michaelzomsuv3631 7 месяцев назад +1

      ​@@stealthhunter6998 Dead Space original can still be played. I played it recently, it was the smoothest experience with constant 240fps not a single frame drop on a 240hz monitor. Way more fun than any shitty lagfest that are 'modern' games and remakes.

    • @stealthhunter6998
      @stealthhunter6998 7 месяцев назад +1

      @@michaelzomsuv3631 Ik… the quality of older games is just the best. Was playing Halo CE in OG graphics again and was amazed by the fact they had RTX like effects in the glass. U can see chief in the reflection of the floor in the control room crystal clear. Sure it’s dated but for back then that’s a huge achievement real time lighting effect. Just so much care and effort put into them compared to today’s stuff.

  • @roymarr6181
    @roymarr6181 6 месяцев назад

You should be comparing the Intel i9 13900K, seeing as it came out close to the same time as the AMD Ryzen 7 7800X3D, and I bet the test would be closer. And lastly, to benchmark CPUs you don't bench at 4K; you bench at a lower res like 1080p etc

  • @ymi_yugy3133
    @ymi_yugy3133 7 месяцев назад

    It kind of depends what's causing the stutter, but it's not impossible that even a slightly lower class CPU has massive stutters where a faster one doesn't.
Imagine something like memory management. As you play you keep accumulating garbage that needs to be freed. If the CPU is idle, it runs the garbage collection; if it isn't, it delays the task.
    A fast CPU might have enough capacity to run the garbage collection before having to render the next frame, but slower one delays it until it goes over some memory limit and then suddenly has a much more expensive garbage collection process. This is of course a contrived example and no sane programmer would program it like this. But when you have many, many tasks, weird things happen.
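The contrived scenario above can be sketched as a toy simulation. This is only an illustration of the argument, not how any real engine manages memory; every constant (frame budget, garbage rate, GC cost, backlog limit) is invented:

```python
# Toy model of the deferred garbage-collection stutter described above.
# All constants are invented for illustration.

FRAME_BUDGET_MS = 16.6    # target frame time (60 fps)
GC_COST_PER_UNIT = 0.01   # ms needed to free one unit of garbage
GARBAGE_PER_FRAME = 100   # units of garbage produced each frame
GC_LIMIT = 2000           # backlog at which a full collection is forced

def frame_times(sim_cost_ms, frames=300):
    """Per-frame times for a CPU whose game logic costs sim_cost_ms per frame."""
    times, backlog = [], 0
    for _ in range(frames):
        backlog += GARBAGE_PER_FRAME
        t = sim_cost_ms
        idle_ms = FRAME_BUDGET_MS - t
        if idle_ms >= backlog * GC_COST_PER_UNIT:
            # Fast CPU: enough idle time to collect every frame,
            # so the backlog never grows and frame times stay flat.
            t += backlog * GC_COST_PER_UNIT
            backlog = 0
        elif backlog >= GC_LIMIT:
            # Slow CPU: collection was deferred until the memory limit,
            # so one frame absorbs the whole cost -> a visible stutter.
            t += backlog * GC_COST_PER_UNIT
            backlog = 0
        times.append(t)
    return times

fast = frame_times(sim_cost_ms=10.0)  # has headroom every frame
slow = frame_times(sim_cost_ms=16.0)  # no headroom, spikes periodically
print(f"fast CPU worst frame: {max(fast):.1f} ms")
print(f"slow CPU worst frame: {max(slow):.1f} ms")
```

In this toy model the slower CPU isn't just proportionally slower: it crosses a threshold where the deferred work lands on single frames, which is how a modest CPU gap can turn into a dramatic difference in stutter.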

  • @Fletcher959
    @Fletcher959 7 месяцев назад

It's a setting in the BIOS

  • @westyk52sparky
    @westyk52sparky 6 месяцев назад

I just did this jump today, but I had a 3700X. I meant to wait for the 8000 series, but it's Xmas

  • @Marco-pf3te
    @Marco-pf3te 7 месяцев назад

Hey guys, question! Is it worth upgrading from a Ryzen 3700X to the 5800X3D? My GPU is a 6800 XT and I plan on buying a C2 OLED 42'' 120Hz

    • @scottanderson9414
      @scottanderson9414 7 месяцев назад

      Yes

    • @Marco-pf3te
      @Marco-pf3te 7 месяцев назад

      @@scottanderson9414 will I not be gpu limited on 4k anyway?

    • @scottanderson9414
      @scottanderson9414 7 месяцев назад

      @Marco-pf3te if you are gpu bound you will inherently want to turn on fsr which will increase demand on cpu. Higher fps higher cpu usage. Imo it'd be worth it. Look at 3700x vs 5800x3d 1440p resolution to see what kind of difference you'd be looking at

    • @scottanderson9414
      @scottanderson9414 7 месяцев назад

      @Marco-pf3te also depends on what your plan is. If it were me I'd probably wait for a gpu upgrade then do whole switch to am5

    • @Marco-pf3te
      @Marco-pf3te 7 месяцев назад +1

      @@scottanderson9414 I see what you mean. Thanks for your response!
      The thing is I will probably wait at least a few years before making the AM5 + GPU upgrade.
      It's just that I want to make most of my new OLED now that Ive finally got my hands on the tech. And this is a relatively cheap upgrade when selling my old cpu

  • @TechButler
    @TechButler 7 месяцев назад

    I still have a 12 year old CPU in one computer that's playing games like the latest SIMS at above 30 fps on a 1660 Super and 16Gb of RAM. It can be done.

  • @kapono2916
    @kapono2916 7 месяцев назад +1

    Digital Foundry switches from Intel to AMD, wow that’s BIG news

  • @AdadG
    @AdadG 7 месяцев назад

    "[...] but it's not like, it's not a generation little difference" Alex said. But... what generation are we talking about, from the 10th to the 11th or from the 13th to the 14th generation of Intel? If so, there are like 5 generations of those together.

  • @Machistmo
    @Machistmo 6 месяцев назад

@8:38 You'd think that you would see higher numbers on the left (night) side.

  • @FNXDCT
    @FNXDCT 24 дня назад

I am very surprised. I thought my 5800X3D was stuttering with shaders, but I notice even the latest CPUs do! :o
Games should find ways to preload the shaders beforehand.
The Last of Us Part 1 preloaded shaders, and I can tell you, with a 5800X3D and 4060 Ti I had no stutters at all, it was buttery smooth.
Unreal Engine 5 games are the worst about stutters for now; they stutter very hard on Nvidia cards, especially when you start the game and in the next 30 minutes of gameplay

  • @Stevecad_
    @Stevecad_ 7 месяцев назад +1

    7800x3d worth it for 4k over the 7600x?

    • @stevesonger1
      @stevesonger1 7 месяцев назад +2

Not so much for 4K, but for the longevity of your system; the extra cores and the 3D V-Cache are going to last much longer

  • @awdrifter3394
    @awdrifter3394 7 месяцев назад +3

    It's kind of disappointing that AMD didn't push the 7800x3d to 10 or 12 cores.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 6 месяцев назад

      Since they've managed to get 16 core CCXs on the newest I think Threadripper CPUs, I'd imagine the 8950X will be just a single CCX, but who knows. Either they ditch the CCX, or they increase the core count. If we've got another 2x8 core CPU I think people are gonna be pretty mad. I've got a 5900X and I'm hoping to upgrade to an 8950X or 15900k and the choice to go Intel will be super easy if AMD decides to not do anything besides increased IPC and clockspeed. They've not increased cores on the consumer Ryzen CPUs since Ryzen 3000 back in 2019, back then, 16 cores was a lot, now it's not really a lot, it's just a good amount.

    • @everope
      @everope 6 месяцев назад

      8000 series is gonna be laptop and 8000G APUs... So next gen will likely be called 9950x

    • @awdrifter3394
      @awdrifter3394 6 месяцев назад

      @@mttrashcan-bg1ro They could've done dual CCX with 6 cores each.

    • @danijelb.3384
      @danijelb.3384 5 месяцев назад

      Why would they? They want us to upgrade to a 12 core x3d chip. They’re going to milk this, understandably

  • @ryzendave9303
    @ryzendave9303 7 месяцев назад

Do you believe the 5800X3D to be a better choice than the 5900X with a 4070 Ti?

    • @JFinns
      @JFinns 6 месяцев назад

      For gaming yes of course, for mixed productivity/gaming no.

  • @puregarbage2329
    @puregarbage2329 7 месяцев назад +4

    Amd is the 🐐 for cpus, I don’t care what anyone says. 5800x3d going strong almost 2 years later. And only at 100watts 👀

    • @Hi-levels
      @Hi-levels 7 месяцев назад

      Am4 is going strong as well.

    • @dvxAznxvb
      @dvxAznxvb 7 месяцев назад

      Yeah 3d cache is barely bottlenecked at this point in time

  • @lgolem09l
    @lgolem09l 7 месяцев назад +1

Well, the reason people might not notice shader stutter may be that they start looking for it after most of the shaders have been compiled. It's not gonna appear more than once in a game until you reinstall the GPU driver