AMD's 16 Core Gaming CPU - 7950X3D vs. 13900K

  • Published: 2 Oct 2024
  • Science

Comments • 132

  • @BPSCustoms
    @BPSCustoms  1 year ago +8

    Just to make sure I was clear with my explanation - AMD's instructions to media regarding how to set up the 7950X3D for benchmarking should not be required for end users, as a lot of the hoops we had to jump through will be automated inside Windows or provided with updated Ryzen Master software. The only requirements will be that your BIOS will need to be updated and the new chipset driver installed. Apparently though, I wasn't the only one having at least some issues with all of this as I have verified that others also had similar problems. I guess I'm not alone on my island of stupid.

    • @caynofficial
      @caynofficial 1 year ago +1

      Not even the damn Xbox Game Bar?

    • @hadleys.4869
      @hadleys.4869 1 year ago

      @@caynofficial From everything I've been told: you need the latest BIOS that supports the new 7950X3D CPUs, you need the latest AMD chipset driver from your motherboard's support page, you must set the Windows 11 power plan to "Balanced" mode to allow the scheduler to park the unused non-V-cache cores, and you need to have Xbox Game Bar installed if you want to leave your 7950X3D at stock parameters and let the software scheduler decide when a game is being played and when to engage the second CCD (CCD1) as needed. Now, if you disable CCD1 (the non-V-cache CCD), you probably won't need the Xbox Game Bar, and with CCD1 disabled you can set the Windows 11 power plan back up to High Performance with 100% max and 100% min CPU power. Having the Windows power plan set to High Performance in the stock configuration will make it so the non-V-cache CCD cores can't be parked and will always be active, which would lead to worse gaming performance.
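
      For anyone who wants to verify the power-plan prerequisite, here is a minimal sketch (assuming Windows' stock powercfg CLI and the standard GUID of the built-in Balanced plan; illustrative only, not AMD's official tooling):

          import subprocess

          # Well-known GUID of the stock Windows "Balanced" power plan.
          BALANCED = "381b4222-f694-41f0-9685-ff5bb260df2e"

          # "powercfg /getactivescheme" prints a line like:
          # Power Scheme GUID: 381b4222-...-df2e  (Balanced)
          scheme = subprocess.run(
              ["powercfg", "/getactivescheme"],
              capture_output=True, text=True, check=True,
          ).stdout

          if BALANCED in scheme.lower():
              print("OK: Balanced is active; the scheduler can park the non-V-cache CCD.")
          else:
              print("Not on Balanced, so cores may never park. Active plan:")
              print(scheme.strip())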

  • @Morne_Smith
    @Morne_Smith 1 year ago +5

    Interesting to see how two CPU manufacturers go in two differing directions, with very close results.

  • @nivea878
    @nivea878 1 year ago +3

    I'm just keeping my 13600K.

  • @Amfibios
    @Amfibios 1 year ago +1

    I don't understand why no one compares it to the 13900KS... the best Intel has to offer against the best of AMD.

  • @Boemtie
    @Boemtie 1 year ago +2

    I'm not really one for all these troubleshooting problems. I just want my chip to work with minimal shit to do myself.

  • @andreim2761
    @andreim2761 1 year ago +1

    PLEASE test simulated dual-, quad- and six-core 3D V-cache CPU SKUs.

  • @DannysTechChannel
    @DannysTechChannel 1 year ago

    Thanks for the Day One review. Love the easy-to-read charts, and they've got some nice visuals too. 😉 Also glad to see the 3D V-cache chips finally released.

  • @Sircinus3005
    @Sircinus3005 1 year ago +3

    Good review. It lets me know I should wait for the 7800X3D.

  • @Blacksadify
    @Blacksadify 1 year ago +15

    While I fully understand why these CPUs are tested at 1080p to put most of the strain on them, pretty much anyone dropping the money for a 4090 and a $700 CPU is going to be playing at 4K, or at a minimum 1440p Ultra. It would be nice to see some benchmarks comparing the CPUs in that range, as that's what people are actually going to be using in real life. If a 7950X3D only gets +4 FPS at 4K vs a 5800X3D or some other lower CPU, it would be beneficial to have those metrics available.

    • @carlthejexican7551
      @carlthejexican7551 1 year ago

      Other reviewers have run those benchmarks and the difference is just not there; it tends to be a waste of time. Even lower-end CPUs handle 4K at the same framerate, with maybe a negligible 1 to 2 frame difference. 4 frames is even just variance in testing and usually ignored.

    • @Blacksadify
      @Blacksadify 1 year ago

      @@carlthejexican7551 Agreed, but illustrating what benefits are there for people with those systems would be helpful. If it's simply not worth upgrading for, that's fine, but have that information out there at least. I know it's much less "exciting" to see negligible gains and all, but if that's the truth of the matter, so be it.

    • @wrusst
      @wrusst 1 year ago

      Dumb comment, as all you're doing is turning it into a GPU test and making a low-powered CPU look as powerful as the 13900 or the X3D in a CPU test.
      That data should only be in a 4090 review, not a CPU review.

    • @Blacksadify
      @Blacksadify 1 year ago +2

      @@wrusst I disagree, because absolutely zero people are spending $700 on a CPU to use it at 1080p. Real-world usage benchmarks are helpful for people to gauge if the CPU in question is actually worth spending money on. It's nice to see them in a CPU-bound situation for comparison's sake, but... nobody is ever using them like that.

    • @Blacksadify
      @Blacksadify 1 year ago

      @@carlthejexican7551 Okay, so in that instance they can illustrate that these CPUs are a total waste of money for those potential buyers. That's the entire point of benchmarks to begin with: to see if the parts are worth upgrading to or not.

  • @FellTheSky
    @FellTheSky 1 year ago +1

    1080p low settings is perfectly fine for testing CPUs. A lot of us play games at those settings, and they can be very demanding when they are multiplayer (BF5, BF2042, PUBG, Warzone 2, etc.).
    The thing is, those games are hard to benchmark because you can't get repeatable results.
    These tests help us decide if the CPU is worth it or not, since single-player games can be played on any modern CPU without issue. Also, there is no benefit in playing a single-player game at 304 fps vs 264 fps. Both are fine.
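
    A toy bottleneck model makes that reasoning concrete: the delivered frame rate is capped by whichever of the CPU-bound or GPU-bound limit is lower, so a CPU gap that is visible at 1080p low vanishes once the GPU becomes the cap. All numbers below are invented for illustration:

        # Hypothetical CPU-bound limits of two chips, and hypothetical
        # GPU-bound limits at three resolutions.
        def delivered_fps(cpu_limit: float, gpu_limit: float) -> float:
            # The pipeline can't run faster than its slowest stage.
            return min(cpu_limit, gpu_limit)

        cpu_a, cpu_b = 304.0, 264.0
        for res, gpu_limit in [("1080p low", 600.0), ("1440p", 250.0), ("4K", 120.0)]:
            a = delivered_fps(cpu_a, gpu_limit)
            b = delivered_fps(cpu_b, gpu_limit)
            print(f"{res:>9}: chip A {a:.0f} fps, chip B {b:.0f} fps, gap {a - b:.0f}")

        # 1080p low: 304 vs 264 -> the 40 fps CPU gap shows.
        # 4K:        120 vs 120 -> the GPU cap hides the CPU difference entirely.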

  • @KhaledPYRO
    @KhaledPYRO 1 year ago +1

    The first time I saw the 3D name, I thought it was the APU that Ryzen would bury most GPUs with.

  • @ChadKenova
    @ChadKenova 1 year ago +1

    Seems like disabling one CCD really helps gaming performance, and now I know why they didn't release the 7800X3D at the same time: it's gonna be faster in games.

    • @profosist
      @profosist 1 year ago +1

      You just said why they didn't...

  • @dkerchner
    @dkerchner 1 year ago

    Awesome video. Been waiting for this review!

  • @zeljkoklepac3180
    @zeljkoklepac3180 1 year ago

    Question: are you running a POWER PLANT for your PC with a 4090 GPU?????

  • @jumpman1213
    @jumpman1213 1 year ago +5

    This product is meant for a very, very small audience

    • @robertstan298
      @robertstan298 1 year ago

      You mean not many people? Yeah, basically.

    • @wrusst
      @wrusst 1 year ago

      Just like a 300W 13900 chip is, as well as the 4090 GPU. Halo vs. halo is a fair comparison.

  • @bogganalseryd2324
    @bogganalseryd2324 1 year ago +1

    It isn't faster; you have to test at least 25-50 games.

    • @hadleys.4869
      @hadleys.4869 1 year ago +1

      My point exactly, and it needs to be compared to a 13900KS with a 32GB kit of 8000MHz DDR5 RAM. The best vs. the best.

    • @bogganalseryd2324
      @bogganalseryd2324 1 year ago +1

      @@hadleys.4869 Waiting for AMD to fix their memory issues so we can use faster memory in our Ryzens.

  • @THE1ONENEO
    @THE1ONENEO 1 year ago

    So for 4K gaming, people should stick with their 7950X; it's better for other tasks too.

  • @antoniocepaj7544
    @antoniocepaj7544 1 year ago

    Is it that difficult to put PCIe 5.0 lanes on the next AM5 APUs?

  • @Philafxs
    @Philafxs 1 year ago

    I'm really curious how the 7950X3D stacks up against the 7950X 105W Eco mode in production applications.

  • @Dan-Simms
    @Dan-Simms 1 year ago

    Good work. I'm still looking to maybe upgrade to the 5800X3D, or more likely just wait another generation to do a full new system.

  • @RobBCactive
    @RobBCactive 1 year ago

    So the reviews of the 7950X3D actually encouraged me to pick up a 5800X3D at a discount. That CPU stood up pretty well against the flagships, so as I'll never have a 4090, the temptation to do a system rebuild rather than a simple CPU upgrade evaporated.
    That said, I was very impressed with the performance at the lower TDP. As suspected, benchmark competition has led to ridiculous inefficiency chasing the last few megahurtz. I saw power consumption histograms, and the X3D was at half the power with a near error-margin performance match.
    On the 5800X3D the Zen 3 stepping improved power consumption significantly, so my upgrade doesn't need more cooling than the 5600X, despite the higher TDP limit.
    As Gamers Nexus showed in the original benchmarks, the cache and slightly lower frequency tuning gave the 5800X3D great power efficiency.
    With a recent BIOS, AMD exceeds the specs: +100MHz on all cores doing rendering, and the max boost is 50MHz above too. Something that becomes apparent is that at least 7 of the 8 cores hit the peak frequency, not just the best 2.
    My 5600X undervolt on the 2 best cores was only -10mV, with -30mV on the other 4, probably because they go to 4.7GHz rather than 4.55GHz.
    The 16c/32t X3D should be a great chip for those who work & play on their PC; there are some simulation/modelling/CAD applications that really fly with the large cache.
    Having the software support to put applications on a CCX and not have cache bouncing is a bonus. Practically, a 120W chip is a lot more comfortable to be around than a 250-300W one that hits 100°C; Intel CPUs have burnt my hands in laptops by heating up the top!
    Overall I have the feeling the P+E core 8+8 strategy compromises: E cores aren't efficient at AVX2, have to be clocked higher than they should be to hit the benchmark targets, and have killed Intel's AVX512 on the desktop.
    If AMD do add little cores to the IOD, I hope they're genuinely compact & kept in the sweet spot, to allow main-CCX parking and let the OS respond to interrupts and process data in main memory without disrupting threads with higher demands.
    I can imagine a laptop CPU with the asymmetric battery-saving benefits but not restricted in the main-core CCX chiplet. V-cache could be very interesting paired with LPDDR5, slower than DDR5.
    Without boosting, my 5800X3D ran the TimeSpy CPU benchmark without exceeding 40°C at 20°C ambient, using less than 45W.
    So the tech seems much more flexible and less hacky than the 11th-13th gen competition. Despite Intel's relative success marketing moooaaaahhhhhrrrrrr cores, few enthusiasts seem to realise that the 8P-only config beats 8P+8E in some cases because of power, and because of the poor performance at some tasks of the SMT-less Atom cores, clocked fast and stealing TDP from the main cores.
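
    A back-of-the-envelope performance-per-watt comparison in the spirit of the histograms mentioned above; the scores and wattages are placeholders, not measurements from this review:

        # (relative score, package watts): hypothetical placeholder figures.
        parts = {
            "V-cache part (~120W)": (100.0, 120.0),
            "250W-class part":      (102.0, 260.0),
        }
        for name, (score, watts) in parts.items():
            print(f"{name}: {score / watts:.3f} points/W")

        # A near error-margin performance match at roughly half the power
        # works out to about a 2x lead in points per watt.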

  • @blothady
    @blothady 1 year ago

    Thank you for a fantastic review. I am wondering if the 7950X3D is able to use 65W Eco Mode.

  • @duladrop4252
    @duladrop4252 1 year ago

    11:22 That is a mouthful of insights right there... ^_^
    But I would probably buy the little 8-core XD...

  • @bradscott3165
    @bradscott3165 1 year ago

    Really good info. I'm subscribing because I don't game, but I am interested in relative performance, and your review didn't make me watch all that stupid and violent videogame footage. Excellent job! PS: Is it really faster to completely disable those other 8 cores rather than letting them contribute?

  • @_nom_
    @_nom_ 1 year ago

    There's a video on overclocking the 7900X3D; they got a pretty decent clock. But I'm curious if it's stable for these benchmarks.

  • @christianprice9832
    @christianprice9832 1 year ago +2

    So an OC'd 13900K with 7200 memory would essentially perform identically in games and win in productivity. Why did AMD put so much hype on this CPU?

    • @lelouchabrilvelda1794
      @lelouchabrilvelda1794 1 year ago +2

      And using 500 watts and getting the best cooler on the market to imagine that Intel can beat AMD, hahahahaha.
      Good luck, Intel fanboy.

    • @choatus
      @choatus 1 year ago +2

      @@lelouchabrilvelda1794 Using the exact same cooler... and not much more power. You could even get 10-15% more out of the Intel chip if you overclock the CPU and optimise the RAM even more... The AMD X3D CPUs are dead in the water. What a stupid release LOL

    • @blkspade23
      @blkspade23 1 year ago

      @@choatus You say optimize the RAM, but that really means having paid for even more expensive RAM. Plenty of people were up in arms over the fact AM5 only had a DDR5 option because DDR4 was so much cheaper. Now the argument is to buy the even more expensive RAM and screw with it. You could just accept "losing" by only 5% for less money and time. Having invested in DDR4 LGA1700 would have left more performance on the table eventually. The next meaningful performance jump on Intel is absolutely going to require a new motherboard, while the next 2 iterations of AM5 chips won't. V-cache clearly works past the deficit in memory bandwidth. We all witnessed an EOL product in the 5800X3D extend the life of a 5-year-old platform and remain competitive with newer offerings. Yet somehow it makes more sense to dump the most money into a whole EOL platform, to basically tie the one with an actual future. There are games where the cache benefit is meaningful as opposed to just scientifically better, things like MSFS and very likely Star Citizen, where the boost is actually needed. How much is there to care about 200+ FPS on dead-end single-player engines?

    • @choatus
      @choatus 1 year ago +1

      @@blkspade23 A die isn't very expensive if you know what kits to get, so your argument is just meh...

    • @choatus
      @choatus 1 year ago

      @@blkspade23 14th gen is coming on this exact platform, so it isn't end of life at all...

  • @claudep.1926
    @claudep.1926 1 year ago

    Can't wait for a follow-up video on non-gaming applications. Yeah, there are other reviewers, but you are 100% unbiased and I appreciate that.

  • @Tainted-Soul
    @Tainted-Soul 1 year ago

    Thanks for your hard work, as your mistakes will help us normal people :) I have made notes, and have already updated my MB BIOS and downloaded the chipset drivers from Asus :) Win win, I'm ready; I just need to get one.

  • @BuzzKiller23
    @BuzzKiller23 1 year ago

    I don't know why YouTube didn't send me a notification for this, and I'm a little upset. I love the channel, Brian!

  • @TheHighborn
    @TheHighborn 1 year ago +1

    Can you please make a video on gaming and streaming at the same time? I'm concerned about Windows shenanigans effectively turning off half the cores. So if I can't game and stream, it's just a bad CPU (on Windows).

    • @rickpostma4064
      @rickpostma4064 1 year ago +2

      Most streamers use NVENC instead of encoding on their CPU, so your CPU usage will be very low while streaming.

    • @TheHandHistoryVault
      @TheHandHistoryVault 1 year ago

      @TheHighborn Rick isn't mentioning the better 1% lows that a higher core count CPU will give you. OBS uses resources, notifications and such. That's also not mentioning the browser you will most likely have up. So on and so on.

  • @geekdoh
    @geekdoh 1 year ago

    That outro almost sounded like this to me --- "Don't let the subscribe button hit you in the ass on the way out.... " 😀

  • @hangemhi001
    @hangemhi001 1 year ago

    outstanding

  • @sascenturion
    @sascenturion 1 year ago

    Excellent and honest review 😀👍
    Well done, thanks!

  • @jamesmiscellaneous
    @jamesmiscellaneous 1 year ago

    You're very honest. Good job sir.

  • @Lenticular67
    @Lenticular67 1 year ago +1

    Yours was the first video to hit my alerts and it did not disappoint. The thermals were the biggest surprise to me, and well done, Brian!

  • @lickrish3930
    @lickrish3930 1 year ago

    Thanks for these gaming benchmarks, for the people that still play at 1080p.

    • @Amfibios
      @Amfibios 1 year ago +2

      I hope you're joking.

    • @hadleys.4869
      @hadleys.4869 1 year ago

      @@Amfibios Buying a 4090 for 1080p gaming is like buying a Ferrari and driving it from red light to red light as fast as you can. Pointless. 4K should be the target resolution for a 4090. It's 4x the detail of 1080p and just looks so much nicer, especially when you invest in a 120Hz or faster 4K monitor.
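
      The "4x the detail" figure is straightforward pixel arithmetic:

          # 4K UHD has exactly four times the pixels of 1080p.
          uhd = 3840 * 2160   # 8,294,400 pixels
          fhd = 1920 * 1080   # 2,073,600 pixels
          print(uhd / fhd)    # 4.0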

    • @Amfibios
      @Amfibios 1 year ago

      @@hadleys.4869 What 4090? This is a CPU comparison.

  • @WhiteVenum
    @WhiteVenum 1 year ago +5

    Great review, dude! Hard pass on the 7950X3D. I'll be getting a 13900KS and calling it a day. I'm not dealing with all that setup.

    • @christianprice9832
      @christianprice9832 1 year ago +3

      Same here. Pair it with 7200 memory and you'll probably be beating the 7950X3D.

    • @WhiteVenum
      @WhiteVenum 1 year ago +2

      @@christianprice9832 I plan on getting the EK direct-die block for it and the best DDR5 I can get. I hear they clock to 6.2 with light work.

    • @lelouchabrilvelda1794
      @lelouchabrilvelda1794 1 year ago

      @@WhiteVenum The 7950X3D is the fastest CPU; check the review of the 13900KS vs. the X3D.
      The difference is huge in terms of power efficiency and cooling requirements.
      AMD destroys Intel easily.
      And Intel is a dead platform, so enjoy :)

    • @WhiteVenum
      @WhiteVenum 1 year ago +1

      @@lelouchabrilvelda1794 I just watched its review; performance-wise it's nothing special. I don't just game on my PC, and I'm not going through 6 different setup procedures and making sure they are up to date to make this thing work right. I'll take the Intel chip and see what I can do with it. I won't be surprised if I can make it outperform the AMD chip in everything, and if it doesn't, oh well 🫡

    • @zhardy323
      @zhardy323 1 year ago

      You don't need to do this setup. It's only for reviewers, because the drivers and BIOS weren't available to the public yet. But if you don't game, these CPUs aren't for you.

  • @MTDucas
    @MTDucas 1 year ago

    The percentage lead of the 7000X3D over Intel or non-X3D chips at 1080p means nothing, because almost everyone buying this chip will play at 1440p at least. Moreover, it makes the X3D chip look better than it is at the realistic resolutions people actually play at on high-end computers.

    • @Amfibios
      @Amfibios 1 year ago

      You're missing the point. Here we're comparing CPUs. If you're gonna run your game at 8K resolution and be severely bottlenecked by your GPU, it's a whole different story.

    • @MTDucas
      @MTDucas 1 year ago

      @@Amfibios And you are missing the words... at least 1440p; we are in 2023 after all.

    • @Amfibios
      @Amfibios 1 year ago

      @@MTDucas I'll say it again... we're testing *CPU* performance, thus trying to remove all other bottlenecks. We're not testing the fps difference at your favorite resolution.

    • @MTDucas
      @MTDucas 1 year ago +1

      @@Amfibios Showing a 3D CPU with 10-15% more performance over other CPUs at 1080p means nothing if the same tests show a 5% difference at 1440p or nothing at 4K. People buying these expensive parts will never game at 1080p, which means tests at 1080p are irrelevant for them and are made just to show bigger improvement numbers. The real battle is at 1440p and 4K these days, just saying.

    • @Amfibios
      @Amfibios 1 year ago

      @@MTDucas I agree with everything you said. The reason I said you're missing the point is that we're comparing apples to apples, and either way you see it, the X3D chip is better.
      Your argument is that even if it's better, it's pointless because at X resolution you won't see much or any difference.
      The game's logic runs on the CPU, and your CPU gives you a hard fps cap. If you test a game at 1080p and you get 200 fps, that's the maximum your rig will ever do regardless of what GPU you slap in it.
      Nowadays with new graphics cards (and to some extent DLSS 3.0), serious gamers are aiming for higher fps. There are monitors that can push 500Hz...
      Some games are more CPU-intensive than others, and if you combine them with high polling rates on peripherals, your CPU can easily take a performance hit.
      Graphics cards are also getting better at a faster rate than CPUs, so for people that don't change PCs every 2-3 years, the CPU can give them a lot of headroom so that a new GPU upgrade will be able to stretch its legs.

  • @m4nc1n1
    @m4nc1n1 1 year ago

    I am sure I speak for most people when I say we do not care about a $700 "Gaming" chip

  • @666okano
    @666okano 1 year ago +1

    Basically we have a decent CPU that can do 10 FPS more at 1080p. It's hardly the king of CPUs. But I'll let AMD have their shout; it's been a while :) Come on though, it's like a car getting 2 more BHP. Honestly, nobody cares.

    • @yifuj
      @yifuj 1 year ago

      It ties the 13900KS in productivity at the same price, beats it for gaming, uses half the power, supports the PCIe 5.0 SSDs coming out next month, and will be upgradeable to the 32-core 3nm Zen 5 with its complete architecture rework, unlike dead-end LGA1700. Seeing that the 7950X3D is sold out pretty much everywhere, it seems that many people do care.

    • @hadleys.4869
      @hadleys.4869 1 year ago

      @@yifuj Now that's just plain delusional. Across a large selection of games the two CPUs will come out to a wash, with neither winning overall. This is especially true at the more real-world resolutions being used with a 4090. When it comes to productivity, the 13900K does better than a 7950X3D in most apps, except for maybe 2-3. Heck, the 3D is on average 5% slower in productivity apps than a 7950X.

  • @keithplayspc
    @keithplayspc 1 year ago +1

    Didn't compare to the Apple M1, fail. delete this video now

    • @BPSCustoms
      @BPSCustoms  1 year ago +1

      YouTube might just do it automatically at the rate this is not accumulating views

  • @TheManicGeek
    @TheManicGeek 1 year ago +1

    Best 7950X3D Gaming review NA

    • @BPSCustoms
      @BPSCustoms  1 year ago

      Full disclosure I paid him to say this

  • @Kapono5150
    @Kapono5150 1 year ago

    Anyone with a 5800X3D is GOOD 👍

  • @Unobserved07
    @Unobserved07 1 year ago +1

    Lol at the performance summary. The average looks like it does because in 1 of 8 games the X3D had a 50 fps advantage... at 1080p low. Testing at 1080p is fine for CPU benchmarks, but testing with low settings is asinine. Several graphical settings actually put more work on the CPU, so while lowering the resolution to remove the GPU bottleneck is wise, setting the settings to low is going to skew the results. Secondly, if you're going to benchmark only 8 games, at least throw out the one outlier for the averages. The performance summary averages are massively skewed because one game has a 50 fps advantage.
    Just editing this in: pairing a 13900K, which has to run RAM in Gear 2, with 6000MT/s DDR5 (slow RAM, especially if using XMP) and testing it against an AMD Zen 4 chip running 6000MT/s DDR5 EXPO (which is equivalent to running in Gear 1) gives a MASSIVE advantage to the AMD chip. Mainstream reviewers either don't understand how RAM works or they're purposely misleading the audience.
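
    The skew described above is easy to reproduce: with only 8 games, one 50 fps outlier pulls the arithmetic mean up far more than a geometric or trimmed mean. The per-game numbers below are invented for illustration:

        from statistics import geometric_mean, mean

        # Hypothetical results: a reference chip at a flat 200 fps and a
        # challenger that wins by a few fps in 7 games and by 50 in one.
        reference = [200.0] * 8
        challenger = [202, 203, 201, 204, 202, 203, 202, 250]

        speedups = [c / r for c, r in zip(challenger, reference)]
        print(f"arithmetic mean: {mean(speedups):.3f}x")        # ~1.042x, inflated
        print(f"geometric mean:  {geometric_mean(speedups):.3f}x")
        trimmed = sorted(speedups)[:-1]                          # drop the outlier
        print(f"outlier dropped: {mean(trimmed):.3f}x")          # ~1.012x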

  • @Theworthsearcher
    @Theworthsearcher 1 year ago

    You just forgot to mention the power consumption. ;) It is better in gaming than the 13900K and uses way less power. ;)

  • @apl1393
    @apl1393 1 year ago +1

    As a gamer, I don't see this being a viable purchase. It's at least £150 more than the 13900K (I'm unable to find any UK prices for the 7950X3D), and then there's the expensive motherboard. It does however run at 60% of the power of the 13900K and has a few years of life in the AM5 platform, whereas Intel, I believe, is in its last stages.
    I'll wait till April to see what the 7800X3D brings.

    • @wrusst
      @wrusst 1 year ago

      Motherboards aren't even that expensive anymore, and if it was gaming only, the 8-core will still exceed the 13900 based on tests disabling the 7950X3D's non-V-cache CCX.

    • @apl1393
      @apl1393 1 year ago +1

      @@wrusst That depends on which motherboard you wanted, but you wouldn't buy a cheap MB with this CPU. You've sort of agreed with my point: you wouldn't buy this CPU only for gaming. Who'd pay all that money for a CPU then turn off half the cores?

    • @wrusst
      @wrusst 1 year ago

      @@apl1393 You wouldn't buy a cheap board for a 13900K either, as it pulls even more wattage in comparison.
      Who would do that? The type of person who bought the 13900KS, who wanted the very best gaming CPU possible and wanted an upgrade path for 2 or more chips now that the Intel socket is dead.
      If you've got 1600 dollars for a 4090, you've got an extra 50 bucks for a board.

  • @TSEDLE333
    @TSEDLE333 1 year ago

    Every time that Intel CPU 'won', it was consuming 70W+ more than the 7950X3D... 100W+ if another comparison I saw is considered...
    And that 'incompatibility' with the software? That's on MS's shitty OS. Hardware doesn't function without software, and that 'soft' is the WEAKEST link EVERY SINGLE TIME... because it's the ONE part of the equation that has little to no pressure to evolve...
    Yes MS, I'm looking at that shoddy f*cked-over old AF code right now...

    • @jjlw2378
      @jjlw2378 1 year ago +2

      My 13900K at 5.8P/4.5E/5.0 ring uses between 70W and 150W when gaming. Average is probably 95W; it just depends on the game.

    • @hadleys.4869
      @hadleys.4869 1 year ago +1

      I couldn't give two sh*ts about using more wattage if the gaming experience is smooth with good 1% lows at higher resolutions. I'll stick with my 13900KS, 7800MHz 32GB DDR5 RAM kit, and 4090.

  • @kevinferrell8237
    @kevinferrell8237 1 year ago

    GIMME DAT!