24-Core Battle: AMD Threadripper Pro 5965WX vs Intel i9 13900k | Which is better for creators?

  • Published: 3 Nov 2024

Comments • 208

  • @theTechNotice
    @theTechNotice  A year ago +46

    There are 2 minor mistakes in this video:
    1. At 7:51, the single-core score of the 5965WX should be 1365, not 2231. The percentages are still correct though.
    2. At 12:49, in the DaVinci Resolve benchmark, there are a few '-%' values that should be RED instead of green.
    Thanks for pointing them out guys! 🙏😇
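
    For anyone double-checking the corrected numbers, here is a minimal sketch of how the percentage gaps work out; it assumes the 2231 shown in the video is the 13900K's actual score, takes 1365 as the corrected 5965WX score from this comment, and the helper name is purely illustrative.

    ```python
    def percent_diff(baseline: float, other: float) -> float:
        """How much faster (+) or slower (-) `other` is versus `baseline`, in percent."""
        return (other / baseline - 1.0) * 100.0

    # Figures from the pinned correction (assumption: 2231 is the 13900K's score).
    tr_5965wx_single = 1365   # corrected value (the video mistakenly showed 2231)
    i9_13900k_single = 2231

    print(f"13900K vs 5965WX single-core: {percent_diff(tr_5965wx_single, i9_13900k_single):+.1f}%")
    # roughly +63%, i.e. the 13900K leads by about 63% in single-core
    ```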

    • @jhu2860
      @jhu2860  A year ago +2

      Also at 7:51, your multi-core value for the 13900K is 25469 and the 5965WX is 25469, yet the percentage gain for the 13900K is listed as 1.90%.

    • @adamtajhassam9188
      @adamtajhassam9188  A year ago +2

      Simple: on cost Intel wins; on performance AMD barely makes a win.

    • @C0smicThund3r
      @C0smicThund3r  A year ago +1

      And at 2:53 where Zen3 and Zen4 are listed as Process Nodes but they're actually just architectural codenames. The process nodes are TSMC 7nm and TSMC 5nm.

    • @josephdurnal
      @josephdurnal  A year ago

      Yeah, I saw the green -% numbers and figured that was one of the mistakes.

    • @Weckmuller
      @Weckmuller  A year ago

      That is very confusing for someone trying to read the data.

  • @nightdonutstudio
    @nightdonutstudio  A year ago +41

    During the DaVinci benchmark the negative numbers should be in red, but instead they are in green, if that's the part you were referring to.

    • @theTechNotice
      @theTechNotice  A year ago +10

      Thanks for pointing that out. Made a correction in the pinned comment! ;) 👍🙏

  • @Real_MisterSir
    @Real_MisterSir  A year ago +12

    I love my 13900K. I work primarily in Cinema 4D and Photoshop (and some DaVinci here and there), and it's the best money I've spent in recent years. One thing often overlooked is power consumption during medium and light loads: sure, when I'm running sick renders for hours it eats a fair bit of power, but in my overall workflow I don't see more than 5-10% high load. Most of my effort goes into modelling, painting (I use Photoshop as a design tool more than a photo-editing tool) and creating various assets, and there the 13900K sips power. The E-cores really are doing an amazing job; under normal load it uses the same power as my old CPU did at idle, which is more or less the same as current-day Ryzen chips. In my experience it's only when you push past 100-150 W that the scales start to turn, but overall I save so much power during the 90% of my workload that isn't pushing the CPU to its full capacity.
    I regard it more as a hyper-efficient CPU for common workloads that calls in the 24-core cavalry when important battles have to be fought :D

    • @danielvipin7163
      @danielvipin7163  A year ago

      Windows 11 or 10?

    • @alderlake12th
      @alderlake12th  A year ago

      @@danielvipin7163 Windows 11, and Windows 10 to a lesser extent. Windows 10 21H2 also supports E-cores, but Windows 11 handles them better.

    • @Real_MisterSir
      @Real_MisterSir  A year ago +1

      @@danielvipin7163 Win10, but afaik the differences are more or less nonexistent at this point. It was only the first month or so that there were big differences in the hybrid core utilization

    • @Filmmaker809
      @Filmmaker809  A year ago

      How much DDR5 memory are you running?

    • @Real_MisterSir
      @Real_MisterSir  A year ago

      @@Filmmaker809 32GB. It's low for my liking, but I'm waiting for 48GB sticks to come down in price so I can jump to 96GB (mostly 3D work benefits here for my tasks).

  • @a120068020
    @a120068020  6 months ago +1

    For my 13900K and 14900K I have done the following: MCE off, PL1 and PL2 limited to 225 W, P-core boost limited to 5.5 GHz and E-core boost to 4.3 GHz, and the Balanced power profile in Windows (although I do disable core parking to keep the system highly responsive). Oh, and just XMP on the RAM. I didn't change the LLC value. I have set a modest voltage offset of -0.015 V and set the current limit to 300 A. I have disabled the C6/C7 C-states and EIST. Lastly, I have locked AVX at a 0 offset. I have tested with P95, CB R23 and CB R15. All great, and in a mid-20s °C room no workload exceeds 80°C on the package or cores. Very happy, and benchmarks are very close to where they were before taming these beasts.

  • @G4ggix
    @G4ggix  A year ago +6

    If I can point out one thing that nobody has said so far: Threadripper, like Xeon, is built for stability, to stay operational 24/7. So if you're a creator who uses the Adobe suite, go for an i9/R9; but if you work in 3D packages and you're rendering (Maya, 3ds Max), simulating (Houdini), or doing game design/virtual production (Unreal Engine/Omniverse), go for TR/Xeon with ECC RAM, because you don't want your software crashing or your PC bluescreening (yes, I'm having these problems with my R9 3950X). Great videos though!

    • @normal_human
      @normal_human  11 months ago

      Just blame every minute of lost sleep on Adobe.
      You'll sleep better.

    • @G4ggix
      @G4ggix  11 months ago

      @@normal_human I don't think I understand your comment. I use Photoshop and Premiere Pro sometimes and I don't have stability issues with that software. Where did I lose sleep, exactly?

    • @normal_human
      @normal_human  11 months ago

      @@G4ggix Sure you do. Everyone does.

    • @G4ggix
      @G4ggix  11 months ago

      @@normal_human If you say so..

  • @ExitOnTheInside
    @ExitOnTheInside  11 months ago +1

    DAW 1st, editing 2nd, gaming 3rd... pleased I've just snapped up a Mini-ITX Z790; the build begins! Thanks for this.

  • @bernardsantos210
    @bernardsantos210  A year ago +18

    I couldn't handle my system issues anymore during heavy After Effects workloads, which is every single project. So I jumped on the 5965WX from Puget. Probably gonna pair it with two 4090s for a multi-monitor setup. But I badly need a stable system with better storage and PCIe capabilities.
    Thanks for the review!

    • @streetDAOC
      @streetDAOC  A year ago +3

      Nobody cares!

    • @Zangetsu_X2
      @Zangetsu_X2  A year ago

      "I couldn't handle my system issues anymore"

    • @nitin_music
      @nitin_music  11 months ago

      @bernardsantos210 are you from USA? 😂

    • @SupraRy
      @SupraRy  2 months ago +1

      He literally just made this comment to try and flex, but nobody cares.

  • @julianfiacconi709
    @julianfiacconi709  7 months ago

    Very informative. Thank you for this. Must have taken a lot of time and effort to compile these results, much appreciated.

  • @Rajanarulmani
    @Rajanarulmani  A year ago +2

    Thanks for the detailed comparison bro .❤

  • @clausbohm9807
    @clausbohm9807  A year ago +22

    Too late, I just finished building my i9-13900K system, but it's nice to know. Great video as always! One of your other videos helped refresh my mind on computer builds. Price per performance for most applications... the i9-13900K wins out if you don't mind the heat and power hunger. I wish you had measured the i9 at its stock power limit of 253 W, since that is the better way to run that chip. Plus tasks can be handled by the GPU, even in V-Ray. So I would save my money on the CPU and upgrade the GPU.

    • @lolohasan6424
      @lolohasan6424  A year ago +7

      I have the 7950X3D. I stream on Twitch at 1080p with the placebo OBS preset while gaming at 4K ultra on an RTX 4090 at the same time. It will run even better with the next-gen Nvidia flagship cards.

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator  A year ago +6

      Being on a newer platform, plus the fact that the 13900K wins in nearly every time-sensitive application (video playback, DAW work with typical mixes, etc.), means you made a good choice. As long as you have a good cooler, the 13900K doesn't really get that hot unless you're blasting it in Cinebench. I have a 13700K overclocked to 5.5 GHz and it doesn't go above 110 W with gaming, streaming, a YouTube video, Discord and so forth running at the same time, so it sits at 50-55°C under load. The TDP is in no way representative of how much the 13th-gen Intel chips actually use in typical workloads.

    • @lelouchabrilvelda1794
      @lelouchabrilvelda1794  A year ago

      @@CyberneticArgumentCreator According to TECH YES CITY, the 13900K and the S series have the worst latency for workstation use. He downgraded to 10th gen because of those issues.

    • @25rodrigomg
      @25rodrigomg  A year ago

      @@CyberneticArgumentCreator Exactly! I've been rendering 3D stuff in Corona Renderer at 100% utilization, 270 W and 85°C.

    • @Faithfps
      @Faithfps  A year ago +2

      I turned off Multi-Core Enhancement in the BIOS (was gonna undervolt but didn't) and my max temps on the 13900K are 60°C.

  • @gheffz
    @gheffz  5 months ago

    Love your style and delivery... great tech channel.

  • @joeyjojojr.shabadoo915
    @joeyjojojr.shabadoo915  A year ago +4

    1) AM5 CPUs actually have 28 PCIe Lanes, with 4 connecting to the Chipset and 16+4+4 General Purpose (or 8+8+4+4 w/Bifurcation). You are still only guaranteed 16+4 Lanes as the other x4 may be used for onboard high speed peripheral connections, although MAY be left vacant for a 2nd CPU-Direct connected NVME. Depends on Motherboard/Manufacturer.
    2) Intel LGA1700 CPUs have the equivalent of 28 lanes with 8x DMI lanes from CPU to Chipset and 16+4 General Purpose (or 8+8+4 w/Bifurcation). Intel chipsets also support far more high speed Lanes with a mix of Gen4/3 on Z690 and Gen 5/4 on Z790.
    For Workstation, Choose AMD. For high End Consumer Desktop and PCIe Connectivity/Performance, Choose Intel. For 2nd Tier (B650 AMD, B660 Intel), Choose AMD. If you don't care about PCIe Lanes, choose whichever you like.

    • @zenithchan1646
      @zenithchan1646  A year ago

      Which AM5 motherboard has 28 PCIe lanes?
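
      A minimal sketch of the CPU-direct lane budgets described in this thread; the split figures are the commenter's numbers above, not official spec sheets, and the dictionary layout is purely illustrative (the 28 lanes come from the CPU, not the motherboard).

      ```python
      # CPU-direct PCIe lane budgets as described in the comment above.
      platforms = {
          "AM5 (Ryzen 7000)": {
              "to_chipset": 4,
              "general_purpose": [16, 4, 4],   # or [8, 8, 4, 4] with bifurcation
          },
          "LGA1700 (12th/13th gen)": {
              "to_chipset": 8,                 # DMI x8 uplink
              "general_purpose": [16, 4],      # or [8, 8, 4] with bifurcation
          },
      }

      for name, lanes in platforms.items():
          gp = sum(lanes["general_purpose"])
          print(f"{name}: {gp} general-purpose + {lanes['to_chipset']} to chipset = {gp + lanes['to_chipset']} lanes")
      ```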

  • @CameraGearGeek
    @CameraGearGeek  A year ago +1

    Thank you for your benchmark comparison. There were almost no benchmarks of threadripper + lightroom at all.

  • @shanent5793
    @shanent5793  A year ago +4

    The TR Pros have varying amounts of L3 cache: 12-16 cores 64MB, 24-32 cores 128MB, 64 cores 256MB. It's a shame there are no lower-core-count 256MB versions, but they probably wouldn't be any cheaper considering what they charge for the frequency-optimized EPYC 7003 series. I'm gonna try the 7950X3D, 7900XTX, a 100GbE Chelsio NIC, and two PCIe 4.0 Optanes. The CPU, RAM, and motherboard are about ⅓ the price of the 5965WX platform, though it's also ⅓ the memory bandwidth, provided 4x32GB sticks of ECC run at 4800 Mbps. Kingston has some Hynix M-die ECC UDIMMs that are QVL'd for 4800 but they're on backorder, so I'll try the Crucial/Micron A-die for now, and we'll see what the recent AGESA update brings.
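
    As a rough sanity check on the "⅓ the memory bandwidth" figure above, here is a minimal sketch of the theoretical peak numbers; it assumes an 8-byte data path per channel and ignores DDR5's split sub-channels, so treat it as back-of-the-envelope only.

    ```python
    def peak_bandwidth_gbs(channels: int, mt_per_s: int, bytes_per_channel: int = 8) -> float:
        """Theoretical peak bandwidth in GB/s: channels x transfer rate x bus width."""
        return channels * mt_per_s * bytes_per_channel / 1000

    am5_dual_ddr5_4800   = peak_bandwidth_gbs(channels=2, mt_per_s=4800)  # ~76.8 GB/s
    wrx80_octa_ddr4_3200 = peak_bandwidth_gbs(channels=8, mt_per_s=3200)  # ~204.8 GB/s

    print(f"AM5 dual-channel DDR5-4800: {am5_dual_ddr5_4800:.1f} GB/s")
    print(f"WRX80 8-channel DDR4-3200:  {wrx80_octa_ddr4_3200:.1f} GB/s")
    print(f"Ratio: {am5_dual_ddr5_4800 / wrx80_octa_ddr4_3200:.2f}x (in the ballpark of the '1/3' quoted above)")
    ```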

    • @lolohasan6424
      @lolohasan6424  A year ago +2

      I have the 7950X3D. I stream on Twitch at 1080p with the placebo OBS preset while gaming at 4K ultra on an RTX 4090 at the same time. It will run even better with the next-gen Nvidia flagship cards.

    • @shanent5793
      @shanent5793  A year ago

      @@lolohasan6424 good to know, but it's gotta be Radeon because of the application. If there were more PCIe lanes it could have both

  • @OficinadoArdito
    @OficinadoArdito  A year ago +8

    For rendering, if you render on the GPU and have at most 2 GPUs, the 13900K or the 7950X with a good chipset are better options, because in modelling single-core performance matters more, and that is the part that consumes most of my day. I would go for Threadripper only if I needed more GPUs.

    • @theTechNotice
      @theTechNotice  A year ago +1

      Very good conclusion!

    • @clausbohm9807
      @clausbohm9807  A year ago +1

      That's exactly what I am doing with two Suprim hybrids and an i9-13900K. Best performance per $$$... and no OC needed with that setup! Blender Gooseberry with only one card: 15 seconds per frame.

  • @halledwardb
    @halledwardb  A year ago

    Hey, thanks. Just did a 7800X3D and an i9-13900K build this week. Love both of them.

  • @450aday
    @450aday  A year ago +8

    Threadripper is a class above, and the cost of the workstation build is way higher (like the cost of a new car). Don't get Threadripper unless you really have the technical expertise to know why. I am very happy with my i9-13900K with twin NVLinked 3090s (which I was lucky to get at a fair price from eBay). It is a powerhouse; not cheap, but not the price of a new car either. I have not even gotten around to using all the performance-boosting features and have not had any reason to do so.

  • @BloodiTearz
    @BloodiTearz  A year ago +1

    I'm using a highly binned 13900KS at 6.2 GHz; this CPU is a beast, even for production and power consumption.

  • @abaj006
    @abaj006  A year ago +4

    Have you noticed the stuttering issues with your 13900k, as reported by Tech Yes City?

    • @mcalin5931
      @mcalin5931  A year ago +4

      ssshhh, this is an Intel plug video

  • @nhanNguyen-wo8fy
    @nhanNguyen-wo8fy  A year ago +3

    On-premise servers, 3D modelling, fluid dynamics simulation, machine learning.
    That's what these Threadrippers are for, not a Photoshop box 😤
    At my roommate's job, his department shares a single server with a Threadripper 3000, 32 cores, and 4x Radeon Pro W6800. I don't understand exactly what his job is, but it's something to do with steel structures in construction.
    An Intel 13900K with an RTX 4090 doesn't even stand a chance against that old machine.

  • @ScottAshmead
    @ScottAshmead  A year ago +2

    You mention the memory bandwidth comparison but didn't show how much difference the extra sticks would make on the Threadripper, so I'm not sure it's a fair comparison if you are limiting the Threadripper. Adding extra sticks to the non-TR platforms would limit their performance because of their memory controllers, but not using the extra performance you can get on a TR leaves me thinking I'm not getting the full story.

    • @Hansen999
      @Hansen999  A year ago +1

      Yeah it doesn't really make sense.
      For reference my Epyc 7302 gained 800 points in Cinebench R23 (running in a VM with 14 cores allocated) just by populating all 8 ram slots.

  • @cemilerkoc4956
    @cemilerkoc4956  A year ago +1

    What do you guys think about the HP Omen GT22-1957nz (Intel Core i9 13900K, 64 GB, 4000 GB SSD, Nvidia GeForce RTX 4090)?

  • @ШотаГоголи
    @ШотаГоголи  A year ago +1

    For 6,590.85 lari I bought an Intel Core i9 13th-generation computer: 64 GB of RAM, a 4 TB SSD, Intel HD Graphics 770, and an ASUS ROG Matrix Z690-E Gaming WiFi 6E motherboard (LGA 1700). An AMD Ryzen Threadripper Pro 5965WX (24 cores, 3.8 GHz) costs 7,971.84 lari on its own.

  • @besssam
    @besssam  A year ago +1

    @thetechnotice None of the benchmark tools used in this video will show the power of the 8-channel memory on Threadripper. You need to use computational fluid dynamics (CFD) benchmark tools and a really large CFD case, and only then will you see what a monster the Threadripper CPU really is.

    • @akyhne
      @akyhne  A year ago +2

      Unrelated!
      This video was about the best CPU for creators, not for scientists.

    • @besssam
      @besssam  A year ago +2

      @@akyhne Engineers are creators; we actually design and create things.

    • @akyhne
      @akyhne  A year ago +1

      @@besssam It doesn't really matter that you make up your own meaning of the phrase "content creator". The video is not for you.
      If you need tests focused more on engineering and scientific calculations, go to the Level1Techs YouTube channel. He does plenty of testing with Threadripper, Epyc, Xeon and the like for your needs.
      This channel is more geared towards content creators, just as Gamers Nexus is geared towards gaming.

    • @mcalin5931
      @mcalin5931  A year ago

      and how will that boost Intel sales?

  • @Pc_Gaming8604
    @Pc_Gaming8604  A year ago

    @Technotice
    7:59 the difference between the processors
    13:08 in the Resolve results the negative percentage is green

    • @theTechNotice
      @theTechNotice  A year ago +1

      Thanks for pointing that out. Made a correction in the pinned comment! ;) 👍🙏

  • @raredreamfootage
    @raredreamfootage  A year ago +1

    Thanks for this comparison. I was wondering myself whether I should get a lower-clock-speed 64-core or a high-clock-speed 16/24-core.

  • @Action-Editing-Pro
    @Action-Editing-Pro  A year ago +1

    What do you think about the upcoming Threadripper 7000 series for workstations?

  • @QuentinStephens
    @QuentinStephens  A year ago +5

    One glaring mistake is that you did not include the direct competitor to Threadripper Pro, which is Xeon W. Let's see you do a Threadripper Pro vs Xeon W (and possibly vs Epyc) test. No faffing around: let's have 512 GB of RAM, multiple NVMe drives, 10+ Gb Ethernet, and so on.

  • @kevinnapier8996
    @kevinnapier8996  9 months ago

    I think the mistake was that some negative indicators were shown in green even though they should have been red. This was in the segment around minute 13.

  • @hybrid1880
    @hybrid1880  A year ago +2

    I’m surprised that you haven’t covered latency in your reviews. Just benchmarks.

  • @vasyaadamov8304
    @vasyaadamov8304  11 months ago

    Thank you!

  • @whitehavencpu6813
    @whitehavencpu6813  A year ago +2

    7:59 - I guess one mistake here is the Geekbench scores?
    The single-core scores you displayed for the 5965WX, 7950X and 13900K are all the same at 2231, yet the difference you highlighted is different for each of them.
    Then the multi-core scores for the 5965WX vs the 13900K: you show the same 25469 for both CPUs but highlight a 1.90% difference for the 13900K, so I'm assuming you pasted the wrong score for the 13900K.

    • @whitehavencpu6813
      @whitehavencpu6813  A year ago

      12:43 - Hehe score coloring in the Pugetbench for Davinci Resolve, namely the 4K and 8K media tests. 13900K and 7950X numbers should be red, not green.

    • @theTechNotice
      @theTechNotice  A year ago +2

      Thanks for pointing that out. Made a correction in the pinned comment! ;) 👍🙏
      You've got a good eye!

    • @whitehavencpu6813
      @whitehavencpu6813  A year ago

      @@theTechNotice No problem, and a solid vid either way! There's barely any content covering these HEDT semi-server chips, so I'm glad I subbed to ya ❤👍

    • @I_ammm_mojojojo
      @I_ammm_mojojojo  A year ago

      @@theTechNotice also at 1:38 "..it's got 2 cores per thread".. should be "2 threads per core"

  • @MarchIdes-xd8pq
    @MarchIdes-xd8pq  A year ago +1

    (Not an expert)
    (Not an Intel employee/fanboy/shill)
    It's interesting that there's no mention of Intel's CET in its 12th/13th-gen chips. That feature alone gives Intel the lead, considerably in my opinion. To my knowledge AMD doesn't really have anything like it, unless someone can link to documentation stating otherwise.
    All of the tech channels seem to overlook it, and I think it's something to consider. If you are not playing AAA games often and do not plan on streaming, then you really don't need a lot. And even if you do, I think Intel is still the best way to go.

    • @FM19086
      @FM19086  A year ago +1

      That's because it's a security feature, and these channels only cover the numbers (cores, clocks, speeds, FPS, etc.).
      I agree though, and you make a valid point. I think Intel's CET in their newer processors (12th/13th gen) is really worth considering when buying a processor today in 2023.

  • @tomsun3159
    @tomsun3159  A year ago

    @11:00 In Premiere Pro the last 2 values for the 7950X are wrong or wrongly coloured.
    The other 2 were already mentioned.

  • @pamus6242
    @pamus6242  A year ago +1

    All 4 of these chips are friggin' amazing. It's all about price, ego and the intended use.

    • @damonm3
      @damonm3  A year ago

      Or two of the three… ego should never be a factor. Or a self reflection and opposing factor in enforcing one of the other two… logic for the win hopefully haha

    • @pamus6242
      @pamus6242  A year ago

      @@damonm3
      It's buoyancy of all three existing in harmony... Like the tri state of water.

    • @damonm3
      @damonm3  A year ago

      @@pamus6242 Eh, I think ego isn't essential or beneficial in any way. So I disagree, and I'm set on proving it!!! (Ironic joke 🍻)

  • @_B.C_
    @_B.C_  A year ago

    Wish you would add some Touchdesigner or Resolume benchmarks to your testing.

  • @mahpooya-damavand
    @mahpooya-damavand  A year ago

    Hi,
    Are you planning to make another video on the new Threadripper?

  • @nnasab
    @nnasab  A year ago

    The 5965WX is a workstation CPU; it's a workhorse for heavy calculations like CAD and CAM design, scientific computation, or simulation.

    • @superspies32
      @superspies32  A year ago

      And there's still no dual-Threadripper motherboard for it. A horrible crime.

  • @Albert-III
    @Albert-III  A year ago

    The table at 8:24 seems to be incorrect. Single core values are the same for three of the cpus
    Edit: I saw the pinned comment

  • @evanlauber
    @evanlauber  A year ago +1

    My 13900K works with all 4 DIMMs at 5200 MHz, very stable; not sure where you got the 4000 MHz from.

  • @gheffz
    @gheffz  5 months ago

    Looks like a good "cheap" server option for a "small" ESXi installation.

    • @gheffz
      @gheffz  5 months ago

      I didn't see the mistake (or plant?)... I was working and listening to you at the same time and occasionally looked at the video. Great video, as usual, by the way.

  • @starman7906
    @starman7906  A year ago

    The 8 memory channels are the most important thing about Threadripper, mainly for CAE simulation workloads. Otherwise, a consumer-grade 2-channel i9 will do.

  • @FreightFox
    @FreightFox  6 months ago

    I own and use both a Threadripper 5975WX and a 13900KS CPU.

  • @ArtBuddyIndian
    @ArtBuddyIndian  4 months ago

    What is the Intel equivalent of the 5965WX processor?

  • @L3nny666
    @L3nny666  A year ago

    You would only want the 5965WX for the PCIe lanes or ECC memory; otherwise it's a waste of money.
    For productivity you either go with the best of the consumer chips for the best price-to-performance, or you go straight to the 64-core if you need raw power. All the Threadrippers in between are pretty much irrelevant if you ask me. I mean, sure, for someone it might hit the sweet spot, but if you can use 32 or 48 cores, chances are pretty high you could also put 64 cores to use.

  • @warcobra3953
    @warcobra3953  A year ago +2

    I'm guessing typos in the Geekbench 5 results.

    • @Sutlore007
      @Sutlore007  A year ago +1

      2231 all the way lol

    • @theTechNotice
      @theTechNotice  A year ago

      Thanks for pointing that out. Made a correction in the pinned comment! ;) 👍🙏

  • @maxhughes5687
    @maxhughes5687  A year ago

    I have three ASUS quad cards and one HighPoint quad card. They need 4 PCIe slots with 16 CPU lanes each. Octa-channel 3200C14 RAM has four times the throughput of dual-channel 3200C14 RAM on a PC. Old software written for 4-core Intel chips wants faster clock speeds, not higher core counts.

  • @maxwellsmart3156
    @maxwellsmart3156  A year ago

    The TR supports ECC RDIMMs and is built for 24/7 processing. You don't buy this for single-core software unless you will run VMs.

  • @Martin-rv5is
    @Martin-rv5is  A year ago

    What happens in games?

  • @computerenthusiast402
    @computerenthusiast402  A year ago

    What are the best-value case, CPU, and cooler for creators? Builders need a foundation on where to start building a good computer, and the case and CPU cooler are a good starting point.

  • @AngelLuisTrinidad
    @AngelLuisTrinidad  A year ago +3

    You don't compare price with performance, so how can you tell which CPU is the best bang for the buck?

    • @FaceyDuck
      @FaceyDuck  A year ago +1

      That’s why people like GamersNexus exist

    • @theTechNotice
      @theTechNotice  A year ago +2

      Sorry I'm not sure I'm following what you're trying to say? :)

    • @AngelLuisTrinidad
      @AngelLuisTrinidad  A year ago

      You don't compare the price of the CPUs versus their performance. Bang for the buck. I don't see price in the performance comparison. Even in the conclusion you do not mention price.

    • @theTechNotice
      @theTechNotice  A year ago +1

      Well... they're in such different classes, with such different purposes, that it very much depends on your use case: if you need PCIe lanes, the TR is the way to go, but if you need a fast CPU then the 13900K makes much more sense and is much better bang for the buck.
      It's like comparing the fuel consumption of a bus and an SUV, if that makes sense. Depends what you need ;)

    • @akyhne
      @akyhne  A year ago

      Price to performance doesn't really matter.
      If you're a company, you'd never go for the consumer CPUs, with no ECC support.
      If you're a private buyer, you'd never go for the Threadripper unless the performance or price were right.

  • @ilhambuono5451
    @ilhambuono5451  A year ago

    It would be nice to have 8 channels of DDR5 with 13900K- or 7950X-level processor performance.

  • @cinemaipswich4636
    @cinemaipswich4636  A year ago

    All those Intel gamer CPUs are fine, but they only have 20 PCIe lanes.
    GPU = 16 lanes, 4 NVMe drives = 16 lanes, 10GbE LAN = 4 lanes, add-in card with 4 NVMe drives = 16 lanes, 4-channel Blackmagic card = 8 lanes, 8-channel audio card = 4 lanes. Only Threadripper can deliver 128 lanes.
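
    To make the arithmetic in that list explicit, here is a minimal sketch that simply sums the lane counts quoted above against a 20-lane consumer CPU; the device list and lane figures come from the comment, not from any spec sheet.

    ```python
    # Device lane requirements as listed in the comment above.
    devices = {
        "GPU (x16)": 16,
        "4x NVMe (CPU-attached)": 16,
        "10GbE NIC": 4,
        "Quad-NVMe add-in card": 16,
        "4-channel Blackmagic card": 8,
        "8-channel audio card": 4,
    }

    needed = sum(devices.values())
    consumer_cpu_lanes = 20        # e.g. a 13900K's CPU-direct lanes (16 + 4)
    threadripper_pro_lanes = 128

    print(f"Lanes needed at full width: {needed}")                           # 64
    print(f"Consumer CPU shortfall: {needed - consumer_cpu_lanes} lanes")    # 44
    print(f"Threadripper Pro headroom: {threadripper_pro_lanes - needed}")   # 64
    ```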

  • @jeffreylevin9728
    @jeffreylevin9728  A year ago

    I'd be curious to see this updated with the Xeon W7-2495X instead of the 13900K. That way it would be apples to apples.

  • @daveg4417
    @daveg4417  A year ago +2

    This video should have been the Intel Xeon W7-2495X versus the Intel i9-13900K... 24 vs 24 on Intel both with PCIe5 and DDR5. Threadripper is old tech now with only PCIe4 and DDR4 etc.
    P.S. The 2495X outperforms the 13900 on multi-thread tasks and has more PCIe lanes etc.

    • @Hansen999
      @Hansen999  A year ago +1

      Agree - I don't get the obsession with the old Threadripper, especially when you know a new version is around the corner.

    • @daveg4417
      @daveg4417  A year ago

      @@Hansen999 - I recently built an Intel Xeon W7-2495X, ASUS W790 ACE, 512GB DDR5-4800, RTX-3090 system. It literally kills my R9-5950X, ASUS ROG, 128GB DDR4, RTX-3080 system in multi-threaded tasks.
      I run mostly high-thread-count software with the systems, and I purchased the Xeon because I need the 512GB minimum, I'm waiting for 256GB RDIMMs to get popular so that I can bump it up to 2TB.
      I looked at the TR Pro currently available, but saw them as dead-ends with PCIe4 DDR4, plus the TR Pro 24-Core system I priced out in Canada was $20,000 CAD, compared to the Xeon at only $12k.
      I decided not to wait for the next gen TR because they will no doubt be even more expensive here. We pay a premium for hardware.

    • @RafitoOoO
      @RafitoOoO  A year ago

      @@Hansen999 Because Threadripper pretty much killed Intel's HEDT. Back then people used to have X58, X79 and X99 even for gaming; nowadays most people don't even know what socket they're using. Threadripper simply conquered all the mindshare when it comes to productivity PCs, and that's why people are obsessed with it.

    • @Hansen999
      @Hansen999  A year ago

      @@RafitoOoO Very true, but Sapphire Rapids has arrived and it dominates the current Threadripper even with fewer cores.

  • @testerlang6658
    @testerlang6658  A year ago

    Which is the winner?

  • @engahmednofal
    @engahmednofal  A year ago

    thanks

  • @DarbTails
    @DarbTails  A year ago

    I wonder if the AM5 platform will see a Threadripper soon.

  • @gabrielegelfofx
    @gabrielegelfofx  A year ago +1

    The math is simple: 2 GPUs at x16 = 32 PCIe lanes, 4 NVMe SSDs at x4 each = 16 lanes. We need a CPU with a minimum of 48 PCIe lanes to build a professional workstation. The alternative is a good chipset that can switch all the necessary lanes to the CPU very quickly.

    • @vinylSummer
      @vinylSummer  A year ago +2

      Honestly, GPUs are fine with 8 PCIe 4.0 lanes each, and man, 4 SSDs? Aren't 2 more than enough? 24 PCIe lanes is enough.

    • @tomsun3159
      @tomsun3159  A year ago

      @@vinylSummer Honestly, NO: a) system drive, b) source drive, c) target drive, d) footage drive, e) scratch drive. The results go into the sinkhole NAS after the project, which doesn't need PCIe lanes, and the scratch drive can be a 4x NVMe card (in RAID) using 16 lanes on its own. So in sum: 16 for the GPU, 4 for NVMe 1 (a), 4 for NVMe 2 (b), 4 for NVMe 3 (c), 4 for NVMe 4 (d), and 16 for the NVMe card (e) = 48 lanes, not counting any reserve for another card that needs PCIe lanes. And NO, switching lanes via the chipset is NOT an option, as it DIVIDES the bandwidth. The question is about maximum CONCURRENT throughput, not sequential throughput. Of course this is a use case normal users (me included) don't have. For a private user a fast switched solution via the chipset is fast enough, as they won't notice the bandwidth limitation caused by switching; in interactive use the user is always slower than the computer and is the bottleneck. But if you have system, source, target, footage and scratch on the same physical drive you will notice the limitation (even on 2 physical drives).
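
      To illustrate the "switching divides the bandwidth" point, here is a minimal sketch comparing the concurrent throughput several fast drives could demand against the single chipset uplink they would share; the ~2 GB/s-per-lane and ~7 GB/s-per-drive figures are rough PCIe 4.0 ballpark numbers, not measurements.

      ```python
      # Rough PCIe 4.0 ballpark figures (GB/s), for illustration only.
      GBS_PER_PCIE4_LANE = 2.0      # ~1.97 GB/s per lane in practice
      FAST_NVME_READ = 7.0          # a fast PCIe 4.0 x4 NVMe drive

      chipset_uplink_lanes = 8      # DMI 4.0 x8 uplink (PCIe 4.0 equivalent)
      uplink_bandwidth = chipset_uplink_lanes * GBS_PER_PCIE4_LANE   # ~16 GB/s, shared

      concurrent_drives = 4         # source, target, footage and scratch hit at once
      demand = concurrent_drives * FAST_NVME_READ                    # ~28 GB/s

      print(f"Shared chipset uplink: ~{uplink_bandwidth:.0f} GB/s")
      print(f"Concurrent demand from {concurrent_drives} drives: ~{demand:.0f} GB/s")
      print("CPU-attached lanes avoid this bottleneck; chipset-attached drives share the uplink.")
      ```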

  • @Zangetsu_X2
    @Zangetsu_X2  A year ago

    teal deer version:
    what you are paying for are PCI lanes and memory channels.

  • @MrKooolAids
    @MrKooolAids  A year ago

    Theoretically that means the 5995WX is a better CPU than the 7800X3D for gaming, since it has more L3 cache?

    • @theTechNotice
      @theTechNotice  A year ago +2

      Well, not really, coz the single-core performance is much lower.

  • @COddietsch
    @COddietsch  A year ago

    I guess for a gamer a Threadripper is overkill. Now I just need to know if installing 2x RTX 4090 24GB is overkill for a gaming rig.

    • @josephspruill1212
      @josephspruill1212  8 months ago

      Unless you’re mining crypto, the answer is yes!

  • @ashtonlipscomb1295
    @ashtonlipscomb1295  A year ago

    The DaVinci Resolve negative numbers are green instead of red.

    • @theTechNotice
      @theTechNotice  A year ago

      Thanks for pointing that out. Made a correction in the pinned comment! ;) 👍🙏

  • @grtitann7425
    @grtitann7425  A year ago

    Well, I would buy AMD regardless.
    I remember how "nicely" Intel treated me when they were on top, and I refuse to support a company that resorts to illegal actions to get ahead.

  • @dn7783
    @dn7783  A year ago

    I haven't used Windows keys in a long time, since I learned the ways of pirating.

  • @xjet
    @xjet  A year ago

    What's wrong? That's a terrible choice of shirt color, given the shades of your background 🙂

  • @alltheboost5363
    @alltheboost5363  A year ago

    The 13900K boost clock is 5.8 GHz, not 5.6, and most will do 6.0 or 6.1 with a good mobo.

  • @mpeugeot
    @mpeugeot  A year ago +1

    I paid $800 for a 2990WX in 2019; it's still pretty fast for what it is and will do 4 GHz all-core easily.

  • @notsimar
    @notsimar  A year ago

    The numbers in the Geekbench and DaVinci Resolve charts were wrong.

    • @theTechNotice
      @theTechNotice  A year ago +1

      Thanks for pointing that out. Made a correction in the pinned comment! ;) 👍🙏

  • @najeebshah.
    @najeebshah.  A year ago +2

    no doubt the intel cpu is much better, not to mention way cheaper

    • @mariuspuiu9555
      @mariuspuiu9555  A year ago +1

      for some of the workloads obviously. but when you buy a workstation CPU you generally do it because you need the extra CPU cores, memory bandwidth and/or extra PCI-e lanes. (and official support for ECC memory - this is very important for some)

    • @akyhne
      @akyhne  A year ago +1

      The average YouTube content creator couldn't care less about stability.
      For a content creation company though, ECC memory is critical. They would never in their right mind choose an Intel consumer CPU over any workstation platform, Intel or AMD.

    • @mariuspuiu9555
      @mariuspuiu9555  A year ago

      @@akyhne correct, mission critical systems will generally have ECC memory.
      as a side-note, you can go with AMD consumer CPUs since you can find ECC support on many motherboards (but you need to do proper research on what works and what doesn't).

    • @akyhne
      @akyhne  A year ago

      @@mariuspuiu9555 Yeah, I actually just bought an AMD Ryzen 9 7900 platform with 8 gigs of RAM, just to get started.
      But I'm considering going for 64GB of ECC. I would probably just go for a RAM module listed on the Asus website for the MB.
      There are a few things that I don't know. Does ECC support dual-channel mode?
      And what the heck does this in my manual mean?
      "Non-ECC, Un-buffered DDR5 Memory supports On-Die ECC function."

  • @JISD27
    @JISD27  A year ago

    7:54 mistake

  • @jimmying8172
    @jimmying8172  A year ago

    7:53 Geekbench single scores are the same

    • @theTechNotice
      @theTechNotice  A year ago +1

      Thanks for pointing that out. Made a correction in the pinned comment! ;) 👍🙏

  • @ripiosuelto
    @ripiosuelto  A year ago

    👏👏👏👏👏👏

  • @BloodiTearz
    @BloodiTearz  A year ago

    9x the price for double performance in some apps, hmmm

  •  A year ago

    "3 times cheaper" = Price − 3 × Price = −2 × Price.
    Please don't use this phrase.

  • @ashoksedhain1714
    @ashoksedhain1714  A year ago

    The Geekbench 5 and DaVinci Resolve charts have data mistakes.

  • @tswiftsbf
    @tswiftsbf  A year ago

    This is kind of an unfair comparison; Intel hasn't released their next-gen Xeon chips yet.

    • @akyhne
      @akyhne  5 months ago

      How is it unfair? He's not comparing it to any old Xeon, and if he were, it would be fair to compare what's on the market.

  • @orion_world9661
    @orion_world9661  A year ago

    Please review the RTX A5000 24 GB & A6000 48 GB, especially in Blender.

  • @As-zc3kt
    @As-zc3kt  A year ago

    The i9-13900K has a max frequency of 5.8 GHz.

  • @NemishKr
    @NemishKr  A year ago

    Should have compared 3 i9s with one Threadripper 😂

  • @oMeGa0122
    @oMeGa0122  8 months ago

    Love the content, but seriously, the charts suck. It took me a while to understand them.

  • @John_kyle5879
    @John_kyle5879  A year ago

    ✌✌

  • @truth_hearts_1940
    @truth_hearts_1940  A year ago

    So... why are you being dishonest by comparing a $1,850 CPU against a $570 consumer CPU? Is it because if you grab a comparable Xeon processor it blows your AMD out of the water?

  • @edynio1
    @edynio1  3 months ago

    With the amount of crap news coming out about Intel processors, I guess there is no more doubt about what to buy. 😂😂🤣

  • @HyperionZero
    @HyperionZero  A year ago

    The 13900K is 5.8 GHz, not 5.6, and it also doesn't pull 340 W. Mine has never pulled more than 250.

  • @itaiperes9635
    @itaiperes9635  A year ago +2

    10x the price for 2x the performance... AMD loses it... RIP AMD

    • @vinylSummer
      @vinylSummer  A year ago

      😂😂😂

    • @akyhne
      @akyhne  A year ago +1

      No! It depends on who you are. A company would never choose the consumer CPUs, no matter the performance difference or price.
      And AMD smokes Intel when it comes to workstation CPUs.

  • @nerolowell2320
    @nerolowell2320  A year ago +1

    Conclusion: Threadripper is a bad option for gaming!

  • @dvogiatzis
    @dvogiatzis  A year ago +2

    Second!

  • @dankodnevic3222
    @dankodnevic3222  A year ago

    Green negative scores... BTW, most of the results are actually an advertisement for the 13900K!

  • @ceuser3555
    @ceuser3555  A year ago +1

    The 13900K has fake cores, so it's not even a fair comparison 😂 Intel wants to make you believe that 24 cores beat the 16 true cores of the 7950X when they both have the same number of threads. Don't be deceived by Intel's E-core nonsense.

  • @braveheart0317
    @braveheart0317  A year ago +1

    Wtf

  • @thom1218
    @thom1218  A year ago

    This is not a fair comparison: the 5965WX has 48 threads, the 13900K has 32... Try comparing the 5955WX (16C/32T) with the 13900K instead, because the 13900K has 16 single-threaded cores (E-cores) and 8 hyperthreaded (2 threads apiece) performance cores.
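
    The thread math behind that comparison, as a minimal sketch; the core counts are the public configurations of these parts, and the helper function is purely illustrative.

    ```python
    def total_threads(p_cores: int, e_cores: int = 0, smt: bool = True) -> int:
        """Threads = P-cores x2 (with SMT/Hyper-Threading) + E-cores x1 (Intel E-cores lack SMT)."""
        return p_cores * (2 if smt else 1) + e_cores

    i9_13900k = total_threads(p_cores=8,  e_cores=16)   # 8*2 + 16 = 32 threads
    tr_5965wx = total_threads(p_cores=24)               # 24*2     = 48 threads
    tr_5955wx = total_threads(p_cores=16)               # 16*2     = 32 threads

    print(f"13900K: {i9_13900k} threads, 5965WX: {tr_5965wx} threads, 5955WX: {tr_5955wx} threads")
    ```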

  • @gh0sted-m3dia
    @gh0sted-m3dia  A year ago +1

    third

  • @Lucatoni22
    @Lucatoni22  A year ago

    Guy's become an Intel shill, sadly.

  • @ThatFinchGuy
    @ThatFinchGuy  A year ago +16

    Intel is always better than any AMD CPU. Intel is best, no matter what the AMD fanboys say.

    • @akyhne
      @akyhne  A year ago +31

      AMD kills Intel, in the workstation and server market.

    • @lelouchabrilvelda1794
      @lelouchabrilvelda1794  A year ago +16

      @@akyhne And consoles, mini PCs, gaming CPUs and 3D V-Cache CPUs.
      Intel NUCs are dead because of AMD's iGPUs.

    • @Smith555
      @Smith555  A year ago +7

      @@akyhne AMD has less than 20% of the server market. In Q1 2023 AMD's server market share was 18%, their desktop share was 19.2%, and their mobile unit share (notebooks/mobile sector) was 16%. Not sure what they're "killing". Their GPU market share was just overtaken by Intel in the last month or two (and Intel has been making video cards for less than two years), with Nvidia dominating that at around 82%.
      The only thing they have going for them is consoles, and that's only because there are no modern Intel-powered consoles.

    • @akyhne
      @akyhne  A year ago +8

      @@Smith555 I wasn't referring to market share, but to their products. AMD CPUs beat Intel hands down!
      And since you know so much about market share in the server market, I'm sure you also know why Intel still leads.
      It's all about adapting to a new platform, rewriting code, etc.
      If you look at the top-ten list of supercomputers, AMD holds 1st, 3rd, 8th, and 9th place. Intel holds 10th place.
      That would have been unheard of just 5 years ago.
      Supercomputers are also planned many years in advance, which is why AMD will probably overtake Intel in the server market within a few years.
      AMD's server CPUs are more power efficient and just plain faster, with their Epyc CPUs holding 300 world records.

    • @Teluric2
      @Teluric2  A year ago +4

      @@akyhne In the TOP500 supercomputers Intel has 90% supremacy.

  • @oscarcampbellhobson
    @oscarcampbellhobson  A year ago +2

    First

  • @alderlake12th
    @alderlake12th  A year ago

    The i9 still kills the 7950X, and partly the 5965WX.

  • @Adam01Time
    @Adam01Time  A year ago

    Time to block your channel

  • @chriswright8074
    @chriswright8074  A year ago +1

    The 13900K isn't a true 24-core CPU.

    • @determination9296
      @determination9296  A year ago +1

      The E-cores are weaker, but they are still CORES. They aren't AI-generated.

    • @theTechNotice
      @theTechNotice  A year ago +3

      What do you mean by that? @chriswright8074

    • @KPAki1Ler
      @KPAki1Ler  A year ago

      I think it would be, if it had 24 P-cores?

    • @theTechNotice
      @theTechNotice  A year ago +2

      How come e-cores aren't real cores?

    • @chriswright8074
      @chriswright8074  A year ago +1

      @@theTechNotice Because it's basically a hybrid CPU, more like ARM and mobile CPUs. In Intel's case the E-cores aren't full P-cores; they're more for offloading. I'm not saying they aren't helpful, but they don't have the pipeline for bigger tasks. I don't count them as full-fat cores because they aren't; they're more like Atom cores. Intel has a problem with the power their design draws, which is why they went with this layout: to increase performance without a massive power draw, instead of making 16 full cores with HT.

  • @tylerlinder9368
    @tylerlinder9368  A year ago +1

    whokeys is a scam