NVIDIA RTX 4090 PCIe 3.0 vs. 4.0 x16 & '5.0' Scaling Benchmarks

  • Published: 16 Jan 2025

Comments • 1.4K

  • @GamersNexus
    @GamersNexus  2 years ago +783

    One correction of an off-hand remark at 4:25 - the RX 6500 XT has 4 PCIe lanes (Gen4), not 8. We forgot how much of a piece of trash that card was, sorry. Our mistake for not double-checking.
    Watch our video about the melting RTX 4090 12VHPWR cables here: ruclips.net/video/EIKjZ1djp8c/видео.html
    The best way to support our work is through our store: store.gamersnexus.net/
    Like our content? Please consider becoming our Patron to support us: www.patreon.com/gamersnexus

    • @legostarwarsrulez
      @legostarwarsrulez 2 years ago +6

      If it had 16 lanes I would’ve bought a couple and used them in some low end budget builds for friends.

    • @wnxdafriz
      @wnxdafriz 2 years ago +9

      @@legostarwarsrulez it doesn't have hardware decoding support... or was it encoding?? or was it both?? ... sigh, it was so bad I can't even remember how bad it was, or if I'm making it even worse (if that's possible)

    • @Flooberjobby
      @Flooberjobby 2 years ago +3

      I know this sounds like it'd be trivial and pointless, but does PCIe performance differ by platform, such as AMD vs. Intel? I know it seems like a no-brainer that they shouldn't differ, but you never know unless it's tested.

    • @MLWJ1993
      @MLWJ1993 2 years ago +5

      @@wnxdafriz it can do hardware decoding, but not encoding, so things like ReLive don't actually work (even though the "feature" was still selectable in the software, at least when it released)...

    • @Ketobbey
      @Ketobbey 2 years ago +1

      THANK YOU! I was waiting for this test to be done by you guys! Thank you, Thank you!

  • @hquest
    @hquest 2 years ago +1426

    Bottom line: Vaseline does not make bits flow smoother, and PCIe 4.0 is more than adequate for today’s applications.

    • @ChrisM541
      @ChrisM541 2 years ago +158

      So is PCIe 3!!!!
      - You'll see, for the vast majority of games, little/no difference from PCIe 3 all the way up to PCIe 5

    • @Senarak
      @Senarak 2 years ago +77

      @@ChrisM541 Very true; I am honestly surprised at how well Gen 3 keeps up with high framerates at 4K. It just goes to show how little bandwidth graphics cards need

    • @IRefuseToUseThisStupidFeature
      @IRefuseToUseThisStupidFeature 2 years ago +27

      I am staggered that something claiming twice the speed (3.0 to 4.0) makes almost no difference, even on the 4090.
      I feel like current hardware tests don't accurately show performance scaling (and each GPU generation seems better than it is)

    • @GamesFromSpace
      @GamesFromSpace 2 years ago +62

      Good devs minimize the amount of data actually transferring to the video card, because we're not idiots. It's *always* a bottleneck; making it less bad doesn't make it good.

    • @TheSkiddywinks
      @TheSkiddywinks 2 years ago +27

      > "...PCIe 4.0 is more than adequate for today’s applications."
      Relative to PCIe 3.0 for sure. But since the card does not support any higher (i.e. PCIe 5.0) it is impossible to say it wouldn't benefit. For sure I would argue it likely doesn't, but in the interests of accuracy, I thought it worth mentioning that no conclusion can be drawn there.

  • @МальвинаКотик-л1ъ
    @МальвинаКотик-л1ъ 2 years ago +967

    4:24 as a proud 6500 XT owner I must point out that it has 4 PCIe lanes, not 8

    • @GamersNexus
      @GamersNexus  2 years ago +470

      Thanks for that. That's an error, yes.

    • @budgetking2591
      @budgetking2591 2 years ago +87

      Exactly, that's its biggest flaw. It would have been a decent GPU if it had 8 lanes, so there would be no bottleneck for people on PCIe 3.0.

    • @Lord_Reset
      @Lord_Reset 2 years ago +162

      It's like a raccoon defending its pile of garbage /s

    • @jaronmarles941
      @jaronmarles941 2 years ago +20

      @@budgetking2591 The 5500 XT had 8 lanes and still got bottlenecked at 4 gigs, on double the bus size. That card was doomed from the start, and 4 more lanes wouldn't have saved it.

    • @youkofoxy
      @youkofoxy 2 years ago +15

      As an owner of an RX 6600 (because I was really unhappy with buying an RX 6500 XT or RX 6400), I can confirm: it's 4 lanes of PCIe Gen 4, with a 64-bit memory bus.

  • @krakentortoise7531
    @krakentortoise7531 2 years ago +89

    Wow, this came at a good time! I was just searching for information on the effects of these. Thanks Steve!

    • @TwistedEyes12
      @TwistedEyes12 2 years ago +3

      @Gamers Nexus shorts 🅥 This spammer is actually the FieRcE channel on YouTube, going by "thanks" or Gamers Nexus shorts. Don't click it unless you want to give them a free view; a very sad way to try to do this. I wouldn't have even cared if you just didn't try to copy and pretend to be Gamers Nexus, including using their logo to trick people.

    • @IRefuseToUseThisStupidFeature
      @IRefuseToUseThisStupidFeature 2 years ago +1

      FFS spam

  • @juansarmiento6754
    @juansarmiento6754 2 years ago +235

    This was really useful for some of the guys who are on AM4 with older motherboards and upgraded to Ryzen 5000. Making a newer GPU purchase will still extend the life of these systems even more.

    • @christophermullins7163
      @christophermullins7163 2 years ago +13

      I just bought a B450 and 5600 6 months ago. It's plenty for most any GPU in 2022.

    • @mastersmurfify
      @mastersmurfify 2 years ago +18

      I'm in that boat - just popped in a 5600X and now waiting for a 4070 or RDNA3 GPU

    • @TheNerd
      @TheNerd 2 years ago

      Wait, isn't the minimum spec for Ryzen 5000 an X470 chipset? X370 seems a little old, since those boards also weren't made for the amount of watts that a 5900X or 5950X can pull.

    • @HB-622A
      @HB-622A 2 years ago +13

      @@TheNerd Some X370 boards did get BIOS updates to support the 5000 series. In fact, even boards as low as A320 have some support, but I definitely wouldn't run an R9 on those.

    • @juansarmiento6754
      @juansarmiento6754 2 years ago +19

      @@TheNerd I have an Aorus X370 with a 5800X3D. Most major motherboard manufacturers updated BIOSes to support 5000.

  • @Velocirabbit53
    @Velocirabbit53 2 years ago +22

    This is incredibly useful information to determine whether or not a platform upgrade is necessary if I want a new GPU. The 9900 is still running like a charm and it looks like it will continue to do so. Thanks y’all!

  • @bencunningham2611
    @bencunningham2611 2 years ago +40

    Thank you so much for this testing. I am someone who has a 4090 FE in a PCIe 3.0 slot, mostly as I was GPU bound running 2K ultrawide. Now I can get my monitor's 175Hz again, even with DLDSR thrown into the mix, and my 9900K at 5.2 can get another generation's worth of use before I upgrade it. :)

    • @T.K.Wellington1996
      @T.K.Wellington1996 1 year ago +13

      I have a 4090 and a 5800X3D on an X370 AM4 board with PCIe 3.0, and 16GB of DDR4-3200 RAM. I could not care less about PCIe 5.0 and DDR5. It does not matter.

    • @De-M-oN
      @De-M-oN 11 months ago +2

      @@T.K.Wellington1996 It does matter. DDR5 improves performance

    • @T.K.Wellington1996
      @T.K.Wellington1996 11 months ago +6

      @@De-M-oN That's why I have the 96MB of L3 3D V-Cache on the 5800X3D as compensation.

    • @H0tCh0c0-gr3qe
      @H0tCh0c0-gr3qe 2 months ago

      What's your CPU now?

  • @alsiniz
    @alsiniz 7 months ago +5

    Thank you Steve and GN team! I just remembered that my B350 motherboard from 2017 is still on PCIe 3.0, with a 4070 Super on the way. This video alleviated my anxiety around ordering a GPU a bit too early.

  • @Devilman666
    @Devilman666 2 years ago +524

    I think the benefit of PCIe Gen 5 is more about being able to use fewer lanes for equal bandwidth. GPUs that use x16 slots shouldn't see a real difference.
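
For reference, the per-lane arithmetic behind this point, as a minimal sketch (nominal one-direction figures after line-code overhead; real-world throughput is slightly lower):

```python
# Approximate PCIe bandwidth per lane in GB/s, after line-code overhead
# (8b/10b for Gen 1-2, 128b/130b for Gen 3-5). Nominal figures only.
PER_LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969, 5: 3.938}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(pcie_bandwidth(5, 8))   # ~31.5 GB/s: Gen 5 x8...
print(pcie_bandwidth(4, 16))  # ~31.5 GB/s: ...matches Gen 4 x16
print(pcie_bandwidth(3, 16))  # ~15.75 GB/s: half of either
```

So a Gen 5 x8 link carries the same nominal bandwidth as a Gen 4 x16 link, which is why halving the lane count at the newer generation is transparent to an x16-capable GPU.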

    • @GamersNexus
      @GamersNexus  2 years ago +188

      Agreed

    • @ffwast
      @ffwast 2 years ago +42

      The question is "why build a GPU that doesn't use all 16 lanes to begin with?" What's the point? Hobbling the performance on older boards?

    • @reaperreaper5098
      @reaperreaper5098 2 years ago +36

      That was true of PCIe 3.0 when it first came out, and is still true of 4.0.
      But mainboard manufacturers have their heads so far up their asses that they refuse to offer anything beyond a pathetic x8/x8/x4 allocation, and even that's not overly common and often requires forgoing some other feature or port (or set of ports) when it's not really necessary.

    • @dominikobora5385
      @dominikobora5385 2 years ago +11

      @@ffwast Cost, plus some things such as NVMe SSDs take up PCIe lanes, so even if the GPU has fewer lanes it has the same bandwidth and other devices can use those lanes

    • @budgetking2591
      @budgetking2591 2 years ago +7

      @@ffwast Because there are many GPUs that don't need more than 8 lanes on PCIe 4.0. Also, that way you don't lose performance when you use 2 NVMe drives, because only 8 PCIe lanes will be left on a lot of boards. I'm glad my RX 6600 XT only uses 8 lanes for its max potential, or else I would have lost even more performance; 8 PCIe lanes is all that's left for the GPU on B450.

  • @masterww8
    @masterww8 2 years ago +14

    Thank you for doing this. It really helps make better-informed decisions!

  • @Netsuko
    @Netsuko 2 years ago +370

    I literally just asked myself if my PCIe 3.0 board would be a bottleneck since I also have two NVMe SSDs. And here's GN's video. Perfect timing, as always!

    • @PolskiJaszczomb
      @PolskiJaszczomb 2 years ago +27

      Well, it's not the bandwidth that's gonna be a bottleneck :D

    • @ffwast
      @ffwast 2 years ago +12

      If DirectStorage starts making an actual difference it might come up.

    • @GamersNexus
      @GamersNexus  2 years ago +184

      The CPU on the board might be a limit, though. Keep in mind also that lane count may get thin with multiple devices, depending on how the board assigns lanes and on which CPU you have.

    • @smiIingman
      @smiIingman 2 years ago +11

      For real, I've been wondering as well.
      I have a 10700K system, so PCIe 3.0, with both NVMe SSD slots occupied, AND I got a 4090 recently, and I can definitely feel the massive bottleneck the 10700K puts on the 4090.
      Planning on upgrading when Black Friday hits though.

    • @dcan4
      @dcan4 2 years ago +19

      If you're on an X470 or B450 with a Ryzen 5000 processor, you'll be fine.

  • @WolvenSwiftwind
    @WolvenSwiftwind 2 years ago +4

    I'm so happy you made this. I am temporarily using my 4090 Suprim on PCIe 3.0.

  • @MagnumMatt09
    @MagnumMatt09 2 years ago +13

    Dang, that PCIe test footage nukes YouTube's bandwidth

  • @KyleKunze
    @KyleKunze 2 years ago +9

    Thank you. This helps me better plan to use some older hardware more effectively and for longer. Appreciate the straightforward and informative piece.

  • @Kiyuja
    @Kiyuja 2 years ago +3

    Oh wow, I thought this would have mattered more, especially in the times of ReBAR and whatnot. Thanks for testing it!

  • @rbtoj
    @rbtoj 2 years ago +38

    6:18 In my experience, playing GTA V with the grass setting at the absolute maximum creates a lot of PCIe traffic, especially in the area near the "Vinewood" hills. Back when PCIe 3.0 was still recent-ish, I remember taking a huge hit in performance when using PCIe 2.0 around this area.
    If I'm not mistaken, HWMonitor has a sensor for measuring PCIe traffic; I believe it's called GPU IO or something.
    Any other game or game area with a high number of draw calls should spike PCIe bandwidth usage too.
    EDIT: From memory, 3DMark used to have a draw call benchmark and it showed very clear performance scaling between the PCIe generations.
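
As an aside on reproducing this kind of observation: on NVIDIA cards, nvidia-smi can report the negotiated PCIe link state, which is a quick way to confirm what generation and width the card is actually running at. A minimal sketch, assuming a reasonably recent driver:

```python
import subprocess

# Query the currently negotiated PCIe link generation and width
# through nvidia-smi's --query-gpu interface.
fields = "pcie.link.gen.current,pcie.link.width.current"
result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # e.g. "4, 16" for a Gen 4 x16 link

# For live PCIe Rx/Tx throughput sampling, `nvidia-smi dmon -s t`
# prints per-second rxpci/txpci columns until interrupted.
```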

    • @Focus_Fearless
      @Focus_Fearless 2 years ago +6

      There is another person in the comments who experienced similar results with grass density, except in a different game.

    • @beninua
      @beninua 1 year ago

      Maybe I'm asking out of ignorance, but why DX11 and not DX12 or Vulkan? I run all games that can on Vulkan, or at least DX12, on my rig (ASUS Z790-E, Intel Core i9-13000K, 128GB DDR5-6000 RAM, Nvidia RTX 4090 24GB), and I can run all games with all settings at ULTRA with FPS never dropping below 100 on my 49" Samsung G9 NEO Super Ultrawide 5120x1440px (aka 32:9) G-Sync (240Hz) monitor. I love your mentioning of the final death of SLI. Man, the headaches I had finding motherboards with enough room for three Nvidia GTX 1080s in triple SLI, or just running two 2080 Ti OCs in SLI on a motherboard with two genuine PCIe x16 slots and two GPUs mounted in SLI, not to mention the extreme cooling needed, with water pipes running everywhere. It was a pest! I love plugging in just one mf.....ing brick of a 440mm-long RTX 4090 and getting ten times the performance. And literally no noise at all from any fan.
      I do have one question though. I have 8 SSDs in my rig. Four of them are normal SATA 8TB Samsung 870 QVOs, but the other four are identical Corsair MP600 PRO NH PCIe 4.0 NVMe M.2 - 8TB. Should it be a problem that I've used all four NVMe slots on the motherboard when they are all Gen 4 capable?
      I actually thought the RTX 4090 card was the first to take advantage of PCIe Gen 5, so I set the PCIe x16 slot to Gen 5 in BIOS. Could that be the reason I constantly get a startup BIOS error saying that the x16 slot has downgraded to PCIe Gen 3 (which is the default if I load BIOS default settings) and that I should press F1 if I want to change BIOS settings manually? Should I simply try to set my PCIe x16 to Gen 4 in BIOS to avoid the fallback to Gen 3?

  • @CheesusSweet
    @CheesusSweet 2 years ago +3

    I'm fairly new to your vids but I love how thorough and articulate you are. Thanks for all the work you guys do!

  • @legendaryx590motherboard6
    @legendaryx590motherboard6 2 years ago +209

    When things get extreme, Vaseline.

    • @Mynx31
      @Mynx31 2 years ago +10

      It raises some questions

    • @JorgeMartinez-dp3im
      @JorgeMartinez-dp3im 2 years ago +22

      We still talkin' bout graphics cards? 😂

    • @legendaryx590motherboard6
      @legendaryx590motherboard6 2 years ago +2

      JK, support the GN store!

    • @phillylove7290
      @phillylove7290 2 years ago +10

      Like my uncle always said

    • @cyko5950
      @cyko5950 2 years ago +5

      @@JorgeMartinez-dp3im it's only a matter of time before overclockers start using lube

  • @emike09
    @emike09 2 years ago +1

    I was super curious about this, thanks for covering it!

  • @HB-622A
    @HB-622A 2 years ago +29

    One thing I think is interesting about the Warhammer 1080p results is that the lows are basically identical despite the average being lower.

    • @jacobparis6576
      @jacobparis6576 2 years ago +2

      It certainly makes it look like a driver or software issue, since that's the behaviour I'd expect from a frame limit being applied.

    • @6St6Jimmy6
      @6St6Jimmy6 2 years ago +4

      @@jacobparis6576 That's also how a PCIe bottleneck can start to show. Lows don't get affected much; highs get much lower, so the average drops. It's more about when in the benchmark run the actual PCIe bottleneck happens.

  • @celcius_87
    @celcius_87 2 years ago +1

    Thank you for testing this because I’ve been wondering about this

  • @Isamu27298
    @Isamu27298 2 years ago +12

    You guys keep on pumping out content. Really grateful for everything you do. So much to learn from your videos!

  • @madarasoun3018
    @madarasoun3018 2 years ago +1

    Thank you for this video. I have been waiting for it.

  • @CyberJedi1
    @CyberJedi1 2 years ago +233

    I wonder if Resizable BAR is making a bigger difference now with the 4090 than with the 3000 series. Could be an interesting video, Steve.

    • @GamersNexus
      @GamersNexus  2 years ago +226

      Banned the spam account. As for ReBAR - good question. We test with ReBAR on, so it might.

    • @TwistedEyes12
      @TwistedEyes12 2 years ago +20

      @@GamersNexus Glad you caught them quick, appreciate everything you do! Cheers

    • @exk4
      @exk4 2 years ago +5

      Strikes me as a good question relevant to this one; if you're on an older platform limited to PCIe 3, there's a good chance you won't have ReBAR either.

    • @Holyspiritrecieved
      @Holyspiritrecieved 2 years ago

      Nvidia is a cinderblock GPU... runs at much higher temps at the cost of longevity

    • @ScottGrammer
      @ScottGrammer 2 years ago +2

      @@GamersNexus Do you suppose that Nvidia will eventually release a driver that turns on Resizable BAR on older GPUs, like my 2080 Ti? I am willing to bet that the hardware supports it; it's just turned off.

  • @EgaoKage
    @EgaoKage 1 year ago +2

    You guys rock! This was near enough exactly the info I was looking for. I have an older PC which I use as a secondary machine (formerly my main PC) that's all PCIe Gen 3.0, which currently has a pair (SLI) of GeForce 950s in it. I don't remember why I went with that build specifically, but I remember there were reasons. Anyway, now that prices on last-gen GPUs have come down a bit, I've been thinking about giving it an upgrade. The options I've been considering are between a 2080 Ti (Gen 3.0) and a 6900 XT (Gen 4.0). But I wasn't sure how much the PCIe Gen of the MB would negate any advantages the 6900 XT had over the 2080 Ti. Thanks for this info!

  • @AndroidPoetry
    @AndroidPoetry 2 years ago +10

    Thank you for including Warhammer 3; lots of places don't do RTS/4X titles, which are one of the main reasons to do PC gaming.

  • @aidanpjpost
    @aidanpjpost 2 years ago +26

    Far out Steve, you and the team have been spitting out great in-depth pieces CONSTANTLY for over a month now. Please take a break for your own sanity!

  • @largote
    @largote 2 years ago +40

    PCIe 2.0 (or 3.0 x8) tests would also be interesting here, since a lot of motherboards go down to x8 when multiple M.2 drives are installed. (X470, in particular, does this, and plenty of people slotted 5800X3D chips onto those and X370 boards.)

    • @MegaMoonse
      @MegaMoonse 2 years ago +5

      Watched the video for this answer. Steve, if you are reading, please do that one test.

    • @kaanozkuscu5079
      @kaanozkuscu5079 2 years ago +4

      @@MegaMoonse People who get 4090s have the money to buy a proper pro mainboard.
      Why care for "benchmarks" if you won't ever have the hardware?

    • @MegaMoonse
      @MegaMoonse 2 years ago +9

      @@kaanozkuscu5079 The main question is how much PCIe speed matters; the GPU is not that important. It could be a 4090 or a future 4050. If Gen 3 x8 is enough, or the performance hit is relatively small, I would certainly keep my motherboard.

    • @cristiansaucedo7893
      @cristiansaucedo7893 2 years ago +1

      @@MegaMoonse They've already done a video on this topic, testing a 3080 on PCIe Gen 2, 3, and 4. Short answer: yes, you lose a decent chunk of performance on Gen 3 x8 (equivalent to Gen 2 x16). If you've got a 30 series card it's worth it to give it the most possible bandwidth.

    • @mytech6779
      @mytech6779 1 year ago

      Gen 2.0 chipsets provided many more lanes. My AM3+ has 40+4 PCIe lanes, and the board provides x16/x16, or x8/x8/x16, or four x8 slots plus two x4. (They are all from the chipset; that generation of CPU did not manage the PCIe on-die.)
      Storage drives and all onboard I/O were handled by the southbridge (which is connected with the "+4" lanes).
      There was also a critical change in the protocol between v2 and v3, and there is another major change between v5 and v6, because with 5 and 6 they are getting into frequencies that cause major circuit engineering problems and high error rates in transmission.
      The original proposed v5 standard was 1.3x v4, not double v4, because there was no proof that the full speed increase could be feasibly obtained in consumer commodity products.

  • @TybJim
    @TybJim 1 year ago +1

    Thanks for making this video. I had a suspicion that PCI Express 3.0 would be enough for modern cards, and your video and my other research lead me to believe it will be fine. I have an older PCI Express motherboard (Asus Rampage V Extreme) in the PC that I built back in 2015. After recently upgrading the CPU to a second-hand 6950X, adding more RAM and an M.2 drive, I'm now looking at new graphics cards. I've been out of the loop for a while, so your comment about SLI being dead was helpful too.

  • @marekkovac7058
    @marekkovac7058 2 years ago +8

    This is good news for 2x 4090 on a Ryzen 7000 series motherboard that supports PCIe 5.0 x8/x8. I was hoping you'd test some rendering/simulations. Thanks for the gaming benchmarks, they tell a lot too! :)

    • @rdiricco
      @rdiricco 2 years ago

      Are you sure that's what you are going to get? I have a 12900K in an Asus MB, and even though I can do Gen 5 @ x8, the card only supports Gen 4 @ x8. So if you cut the lanes to x8 (I'm using the other x8 for M.2 drives) you only get the Gen 4 speeds, so only 16GB/sec, not 32GB/sec.
      If you're seeing something else, let me know; I'd be interested in how you got there.

    • @mikeramos91
      @mikeramos91 2 years ago

      @@rdiricco I recently upgraded and see my card running at 4.0 x8. It's technically the same as 3.0 x16. Will there be a difference?

  • @bleedinggumsmurphy537
    @bleedinggumsmurphy537 2 years ago +2

    I was just debating upgrading from Gen 3 to Gen 4/5 with a full mobo and CPU upgrade. Thanks for saving me some money for the time being.

  • @EVPointMaster
    @EVPointMaster 2 years ago +37

    I noticed that the game "Flower" is actually extremely bandwidth sensitive, and also that the grass density setting impacts this a lot.
    I was curious, so I tested with different PCIe bandwidths. I used an RTX 3080.
    The frame rate scales almost linearly with the bandwidth!
    1.0 = 36.4fps
    2.0 = 69.9fps
    3.0 = 127.7fps
    4.0 = 222.4fps
    It's possible that the bottleneck shifts at 5.0, but even then it could be interesting to test for cards that are only x8.
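
A quick worked check on those numbers: each PCIe generation doubles link bandwidth, and the reported frame rates come close to doubling each step, tapering slightly as some other limit creeps in. A minimal sketch using the figures above:

```python
# The commenter's reported RTX 3080 results in "Flower", keyed by PCIe gen
fps = {1: 36.4, 2: 69.9, 3: 127.7, 4: 222.4}

for gen in (2, 3, 4):
    ratio = fps[gen] / fps[gen - 1]
    print(f"Gen {gen - 1} -> Gen {gen}: {ratio:.2f}x")
# Gen 1 -> Gen 2: 1.92x
# Gen 2 -> Gen 3: 1.83x
# Gen 3 -> Gen 4: 1.74x  (bandwidth doubles; scaling slowly tapers)
```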

    • @adjoho1
      @adjoho1 2 years ago +2

      Ffs more spam.

    • @EVPointMaster
      @EVPointMaster 2 years ago +3

      @@adjoho1 I always report these comments, but I'm not sure if YouTube actually does anything about it.

    • @GraveUypo
      @GraveUypo 1 year ago

      @@EVPointMaster they don't

  • @lennieb7367
    @lennieb7367 1 year ago +1

    Thank you very much! Thanks to your testing, I bought a PCIe 4 motherboard for ~$200 instead of paying almost ~$600 for a PCIe 5 motherboard.

  • @LunarwolfGaming
    @LunarwolfGaming 2 months ago

    Thanks for the video. I am getting into video editing and am SERIOUSLY needing an upgrade to my graphics card, but wasn't sure how my motherboard would match up with the new cards. Turns out I'm just a dummy, because my mobo has a PCIe 4.0 x16 slot and I thought it only had Gen 3. Even so, I can see upgrading my graphics card would've been a win anyway. Thanks again; love the channel and the no-nonsense data and presentation.

  • @jeffsaffron5647
    @jeffsaffron5647 2 years ago +3

    Great video; always nice to test this when new cards come out.
    I personally use a DeckLink 8K (x8 PCIe 3.0), and it does not work reliably over shared chipset lanes at all. They need to be on CPU lanes. So unless one is on Threadripper or something with a bunch of CPU lanes, it is good to know we can run in x8/x8 split mode on cheaper platforms with no real performance hit.

    • @inkprod
      @inkprod 2 years ago

      I don't think it's necessarily the chipset lanes, but more the amount of them. Asus has an X570 x8/x8/x8 board with the 3rd x8 connected to the chipset. The chipset itself is only connected PCIe 4.0 x4 to the CPU, but that still leaves enough bandwidth to throw four 12G feeds at the DeckLink.
      We have a bunch of them, with a mixture of Quad 2s and 8Ks in the chipset slot, without any issues.

    • @jeffsaffron5647
      @jeffsaffron5647 2 years ago

      @@inkprod Yeah, it may work on chipset lanes just fine; the thing is, dedicated CPU lanes are a certainty. Back in PCIe 2.0 days I had a DeckLink with a single HD-SDI struggle on chipset lanes. It might work on chipset lanes, or it might not.

  • @studlyruddly16
    @studlyruddly16 2 years ago +1

    This test 100% eased my concern about buying a PCIe Gen 4 board for my new R7 7700X PC

  • @thrilhouse007
    @thrilhouse007 2 years ago +5

    "Joe Rules, Steve Drools."
    How long has that sign been there in the background?

  • @dgraves14
    @dgraves14 2 years ago +9

    PCIe 3.0 x8 would have been interesting to see. While most people buying 4090s are unlikely to still be using PCIe 3.0 motherboards, it'd be interesting and useful to know just in case.

    • @Mom19
      @Mom19 2 years ago +2

      Well, it's the same as Gen 2 x16, so if someone tests that scenario at some point, you can go off that.

    • @12Burton24
      @12Burton24 2 years ago

      PCIe 3.0 x8 can already be overloaded by - I think it was - the 1080 Ti.

    • @mrsuperselenio5694
      @mrsuperselenio5694 2 years ago +1

      @@12Burton24 Not really; there are tests that show PCIe 2.0 x16 / 3.0 x8 has a loss of 3 to 10 fps tops, which honestly is within margin of error.

  • @chincemagnet
    @chincemagnet 2 years ago +2

    Sweet! I was hoping to see this from a reputable channel 👍🏼😁

  • @edwardallenthree
    @edwardallenthree 2 years ago +17

    Always interesting. I still run my GPU, a 3070, at 3.0 x8 because I use the other x8 for an InfiniBand card. The performance hit is about 5%.

    • @МальвинаКотик-л1ъ
      @МальвинаКотик-л1ъ 2 years ago +1

      Very interested in why you need an InfiniBand card. Isn't that a server thing?

    • @edwardallenthree
      @edwardallenthree 2 years ago +10

      @@МальвинаКотик-л1ъ It is indeed. While the Windows drivers require a special version of Windows Pro or Windows Server, the Linux drivers are free and old InfiniBand gear is dirt cheap. I use it as the backbone for my NAS. This leaves me in the ridiculous position of running my Windows machine in a virtual machine, where it's basically the only thing on the device, so that I can use Linux drivers for my storage. Realistically the speeds are about equivalent to what you get with 10 gigabit Ethernet, but with far lower CPU overhead if you do it right. And when you can get an 8-port switch, nominally 40Gb/s per port, for $89? It's a great way to get performance cheap.

    • @budgetking2591
      @budgetking2591 2 years ago +1

      I also have to run my GPU at x8 PCIe, because both NVMe slots are occupied, so I'm actually glad the RX 6600 XT only uses 8 lanes.

    • @marekkovac7058
      @marekkovac7058 2 years ago

      @@edwardallenthree Interesting... what kind of file system are you using? Is the CPU offload due to RDMA?

    • @edwardallenthree
      @edwardallenthree 2 years ago

      @@marekkovac7058 ZFS on the server, but NFS over RDMA to share it. Network performance exceeds the performance of the array, significantly, even with NVMe caching.

  • @hugglesthemerciless
    @hugglesthemerciless 9 months ago

    I appreciate y'all doing this test. I'm still on a Ryzen 2700X with PCIe 3 and will be buying a 4090. Good to know that I'm not gonna be bottlenecked by that and don't need to upgrade the whole PC immediately

  • @191desperado
    @191desperado 2 years ago +3

    THIS is what I’ve been waiting for! THANKS!

  • @wedgoku
    @wedgoku 2 years ago +7

    Thank you for testing. I game at 4K resolution and I'm currently using an Intel Core i5-10600K, which is limited to PCI Express 3.0 speeds; for gaming, it looks like there is no reason for me to upgrade my CPU then.

  • @iain.collins
    @iain.collins 2 months ago

    Really appreciate this testing and insight! There are still a few things that an RTX 4090 can't quite handle - in my case specific to some PCVR titles at higher resolutions and when relying on super sampling at 4K to address cosmetic issues with games that don't support DLSS. This video was helpful as I was specifically wondering how much the PCIe 3.0 on my older generation motherboard might be a limiting factor and what options (if any) there are for boosting performance a bit more. Thanks again for another great video!

  • @PublicBurrito
    @PublicBurrito 2 years ago +3

    I'm happy to see these results. I'm on a Z590 and was going to be pretty bummed if the performance was drastically different.

  • @catch2030
    @catch2030 2 years ago

    Thank goodness you did this video. The amount of Reddit posts that say "you are bottlenecking your GPU by running on Gen 3" is absolutely absurd. Then when your old video gets linked it's "ewww, that's old and not relevant today"... My favorite is the "you have the card in an x8 Gen 4 slot and are killing the performance of the card."
    Anyhow, what I am really excited to see is how PCIe Gen 5 trickles down to storage. For consumer parts they should be able to have the drives be x2 (same speed as Gen 4 x4) and save the additional lanes for more USB connectivity or additional M.2 slots. Great video as always; now to figure out how to convince the wife the GN Store is just a bill... lol

  • @Operational117
    @Operational117 2 years ago +4

    I believe the reason 1080p might benefit from upgrading from PCIe 3.0 to PCIe 4.0 is because 1080p nowadays is associated with higher framerates, and higher framerates mean the CPU needs to send more commands to the GPU per second (and perhaps more data as well).
    It's just a theory, though.

  • @MrPontus
    @MrPontus 2 years ago +1

    Thank you for this video. I have a 10980XE and am not planning to change platforms for a few years, so I will run my next graphics card at Gen 3 x16

  • @Descenter1976
    @Descenter1976 1 year ago +3

    So when they tell me I need PCIe 4 for a 4070 Ti, and I only have PCIe 3, I don't need to go out and buy a new motherboard; in fact, I will hardly notice the difference!?

  • @mpr746
    @mpr746 2 months ago

    A massive thank you for this video! I've just upgraded from my R7 2700 to a 5700X and was wondering if my B450 was bottlenecking my performance. I'm glad to know that I don't need to worry about that!

  • @jolness1
    @jolness1 2 years ago +11

    I'm surprised that 3.0 isn't bandwidth limiting still, since this is years into cards with 4.0. Makes me feel better about holding on to my X370 and running a 5800X3D with a 4090.
    I do think this will continue for a while; PCIe 4, 5, etc. are useful in data centers (and, measurably but maybe not practically, for NVMe)

    • @jolness1
      @jolness1 2 years ago

      @pauldenton1230 GN does really good and thorough testing; I trust it. I didn't want to buy a new board for an EOL socket, and my board has plenty strong of a VRM. How are you liking the 4090? I'm very impressed with the FE

    • @kecimalah
      @kecimalah 2 years ago

      I have an X370 with a 5800X3D and a 4090, and I'm going to change the board to a B550 because of 4.0, as it costs nothing compared to the 4090; it's more about the work involved in switching it. Also, I will upgrade my main SSD to a 2TB NVMe 4.0 (from a 500GB 3.0). I am using that PC mainly for VR, and in Flight Simulator 2020 in dense city areas I get quite a lot of stuttering; this upgrade should help with bottlenecks and hopefully solve those stutters. I don't think I will notice improvements much anywhere other than FS2020. Other sims, mainly racing, are running fine and are mainly GPU limited, but a few more FPS could help in some places to be completely without stutters. VR should be about good immersion, and stutters quite ruin it, so this fairly cheap upgrade should make it better; it would be a shame not to use 100% of the potential of that expensive 4090. For non-VR gaming I don't think the upgrade makes sense; I also don't see much benefit using a 4090 for playing games on a 4K 60Hz LCD, it looks and runs almost the same as with a 3080.

    • @kecimalah
      @kecimalah 2 years ago

      @Paul Denton Because it is the most powerful CPU now, for example for FS2020, because of the 3D cache, and with that board it would support PCIe 4. But you are right; it looks like new Ryzens, also with 3D cache and DDR5, should be on the market soon, and those should probably be quite a bit faster and more prepared for the future.

    • @mayacakil
      @mayacakil 1 year ago

      @@kecimalah What happened in the meantime? I also have a 5800X3D and am thinking of buying a B550

  • @cars291
    @cars291 1 year ago

    Awesome and thorough testing 👏

  • @BlueBoy0
    @BlueBoy0 2 years ago +4

    Really glad you did this one. There's a bug on Asus 500-series motherboards with 11th-gen CPUs and the 4090 where it only runs the card at PCIe 3.0 (despite running a 3090 at 4.0). I've been wondering how much performance I'm losing.

    • @NaanStop96
      @NaanStop96 2 years ago +1

      I'm having a similar issue with a B550-I ITX mobo from Asus. I'm using a 5800X3D, and with Gen 4 I can't get a display; only when it's set to Gen 3.

    • @BlueBoy0
      @BlueBoy0 2 years ago +1

      @@NaanStop96 Definitely a BIOS issue. I hope Asus still updates these motherboards...

    • @NaanStop96
      @NaanStop96 2 years ago +1

      @@BlueBoy0 Likewise. No other issues, however, other than Gen 4 causing no display, so that leads me to believe everything else is working correctly and not defective. For now, I'm glad the performance difference is negligible.

  • @erfa10
    @erfa10 2 years ago +2

    Thanks for the continued barrage of extensive testing, GN!
    I'm curious to know whether DLSS 3, with its higher framerates, might be more impacted by the bandwidth... likely not, considering that the CPU is only required for every other frame, but it might be interesting to confirm as upscaling becomes more common.

    • @sharathvasudev
      @sharathvasudev 2 years ago +1

      It shouldn't. The bandwidth is mostly needed to communicate with the CPU, and DLSS 3 frame generation happens entirely on the GPU; it's in fact CPU independent. Higher VRAM speeds could help with accessing the on-GPU data from previous frames

  • @scarletspidernz
    @scarletspidernz 2 years ago +4

    I want PCIe Gen 6 motherboards to change the GPU slots to using/reserving x8 instead of x16, maybe leaving x16 on the extreme high end only for professionals (for who knows what cards), and put those extra 8 lanes into more discrete I/O rather than "shared with" arrangements

  • @corr2143
    @corr2143 2 years ago

    I love your videos. I'm just a little confused that all this work was done to show the performance of 3 games; it would be nice to see some encoding software and rendering (DaVinci Resolve and Blender) as well. Nonetheless, appreciate the free resources.
    This channel, like many others, taught a valuable thing: benchmark and research your use case online before purchases. No more buyer's remorse and less e-waste. Thanks for the work over the years.

  • @magottyk
    @magottyk 2 years ago +4

    PCIe generation is more relevant to boards that can bifurcate the GPU slot.
    If there's not much difference between Gen 3 and Gen 4 on a 4090 at x16, it would be useful to know what happens if the card is limited to x8 lanes in these scaling tests.
    One major use for bifurcated slots is direct-to-CPU M.2 SSD add-in boards, so the x8 bandwidth results would be useful.

    • @TryllHDTv
      @TryllHDTv 1 year ago

      This is exactly my curiosity!!

  • @declangallagher1448
    @declangallagher1448 2 years ago

    THIS IS THE EXACT VIDEO I NEEDED THANK YOU

  • @kirby0louise
    @kirby0louise 2 years ago +3

    Hybrid graphics is definitely something you could explore as a PCIe-heavy workload. Even on desktop it's somewhat relevant, as decoding videos on your iGPU is more power efficient than on the dGPU, plus there's less risk of getting into a memory bandwidth fight with a game

    • @niks660097
      @niks660097 1 year ago

      x16 PCIe 5 has more bandwidth than most dual-channel DDR4/LPDDR4 laptops, or single-channel LPDDR5 laptops, or iGPU desktops. Nowadays PCIe bandwidth is never the bottleneck, with things like CXL; in encoding/decoding your SSD will always be the bottleneck, not PCIe bandwidth, and dGPU encoders and decoders like Nvidia's NVENC can do multiple 4K60 decode/encode streams, even in AV1. So your comment is not valid in 2023, maybe 2015? Nowadays AMD and Nvidia are way ahead of Intel Quick Sync...

  • @NukeMyHouse
    @NukeMyHouse 2 years ago +1

    Wow, YouTube compression was really put through the wringer at around 6:00 lol

    • @HB-622A
      @HB-622A 2 years ago

      Yeah, that was nasty. I paused to check the video's resolution setting to make sure it hadn't auto-reduced to 360p or something, but it's just the sheer compression.

  • @RoughRunnerAce98
    @RoughRunnerAce98 2 years ago +4

    In the real world, this basically confirms for me that you can run close to the best CPU and GPU on an old B350 board that has PCIe 3.0 and only lose single-digit percent performance from it. Hardware Unboxed showed that B350 boards can run the 5800X3D with little performance loss, within reason of course; you should pay close attention to your VRM temperatures and not take it too far. I personally won't be going that far; the 5700X is the best 65W CPU on AM4 to my knowledge, so that's what I will be upgrading to from my 1700, and now I know that PCIe 3.0 won't hurt at 1440p. Now I will wait to see how RDNA 3 and the rest of the 40 series scale with the 5700X to upgrade my 1070. I wish GN did that testing, but HU usually does great scaling videos as well; they did an excellent one for the 5000 series CPUs with 30 series and 6000 series GPUs. AM4 was truly a great platform for longevity. Thank you for doing tests that answer important questions.

    • @Littleandr0idman
      @Littleandr0idman 2 years ago

      Can confirm. Upgraded my 1600 to a 5700X on a B350 board, and upgraded my GTX 1070 to a 6700 XT. I actually had worse performance with the new setup until I realized that the BIOS update had reset the RAM's clock speed. Once I got it back up to 3200MHz, it's been smooth sailing

    • @mapesdhs597
      @mapesdhs597 2 years ago

      Indeed. The same can apply all the way back to IB/IB-E CPUs on P67/Z68/X79; in fact, it was possible to force 3.0 even with SB-E on X79 (such as the 3930K, because it was a Xeon in disguise). All one is then limited by for such older setups is overall CPU strength, but that's a whole other thing.
      I just find it interesting how Gen 3.0 ended up spanning so many years of different boards and sockets (well, mainly Intel), whereas now the industry seems to be skipping through 4.0, 5.0 and beyond a lot faster. People perhaps forget the utility of what once was; e.g. my lowly old 4c/8t 4820K (so easy to OC) has 40 lanes of 3.0 from the CPU, so it can actually do multi-device/GPU things which much later SKUs like the 7820X could not (that only had 28). Plus, relevant motherboards had lots of lanes off the chipset as well; e.g. my P9X79-E WS supports x16/x16/x16/x16, half from the CPU, half from the chipset via two PLX chips IIRC, or it can even run x16/x8/x8/x8/x16/x8/x8 using all seven slots. See:
      www.anandtech.com/show/7613/asus-p9x79e-ws-review
      It all became horribly complicated when, for a time, the number of lanes from the CPU depended on the SKU (really horrible product segmentation; it meant slot device support, and even whether some slots could be used at all, depended on which CPU was fitted), e.g. the original 4c CPU for X79 (i7-3820) had 40, yet the much later 5820K and 6800K only had 28. So glad when all that malarky came to an end.

  • @scherge
    @scherge 1 year ago +2

    As someone who really likes vertically mounted graphics cards, I was blown away by the price of a good PCIe 4.0 riser cable. 100+ € compared to the 15 € I paid for my ROG 3.0 riser cable; that's just insane. Board and GPU both support 4.0, but I am not willing to pay that much extra for a 0.5 to 3 percent performance increase at best. Thanks a lot for this very competent video. I will not buy a 4.0 riser before it provides any actual benefits.

  • @budgetking2591
    @budgetking2591 2 years ago +7

    The real problem with PCIe 3.0 occurs when you have a graphics card that only has a x4 or x8 bus, like many AMD GPUs. I have some performance loss with my RX 6600 XT on PCIe 3.0.

  • @seanripperger
    @seanripperger 2 years ago

    Side comment. I appreciate these shorter videos once in a while. Makes it easier to fit in a break.

  • @HiHi-eq2tn
    @HiHi-eq2tn 2 years ago +6

    Larger margin than I expected; I was expecting a 1-2% difference.

    • @TwistedEyes12
      @TwistedEyes12 2 years ago

      @Gamers Nexus shorts 🅥 @thanks 🅥 This spammer is actually the FieRcE channel on YouTube, going by "thanks" or Gamers Nexus shorts or Bully Maguire. Don't click it unless you want to give them a free view; a very sad way to try to do this. I wouldn't have even cared if you just didn't try to copy and pretend to be Gamers Nexus, including using their logo to trick people.

    • @GamersNexus
      @GamersNexus  2 years ago +4

      1% is pretty much error/variance in most instances. 2% is real, but still has +/- a bit of range.

  • @t0mn8r35
    @t0mn8r35 2 years ago

    Very interesting test and very interesting test results. Well done as always.

  • @andrewvirtue5048
    @andrewvirtue5048 2 years ago +3

    Explain why the quantity of PCIe lanes a device uses is important, which devices use them, and how many can be used at once.

    • @Vile-Flesh
      @Vile-Flesh 2 years ago +1

      I would like to know this as well. I don't understand PCIe lane allocation, or why they compared 3.0 x16 to 4.0 x8, and now I am further confused since he mentioned lanes from the CPU and lanes from the chipset; pulling lanes from the chipset was possible but not often done. Where are the options for that?

  • @riba2233
    @riba2233 2 years ago

    Was waiting for this, thanks!

  • @croakingembryo
    @croakingembryo 2 years ago +4

    Wouldn't the main difference be in loading the game? Like how fast the memory gets populated?

  • @hamzasamar8823
    @hamzasamar8823 2 years ago +1

    Thank you so much, this is so informative

  • @Co-opSource
    @Co-opSource 2 years ago +5

    Please cover VR gaming performance too. 🙏🏽

  • @AnAngryRedGummyBear
    @AnAngryRedGummyBear 1 year ago +1

    As someone with an X370 mobo, a 5800X3D, and considering a 40-series card at some point in the future, thanks for this demo. This gives me the confidence to try and take the same mobo through a whole decade - I built this system in 2017.

    • @defranken
      @defranken 1 year ago

      I have the same except for a 5900X. I built the original system in 2017 as well, with an X370 Taichi. I'm now running a 7900 XT and it works perfectly well.

  • @cdurkinz
    @cdurkinz 2 years ago +4

    I feel like this will matter more if you have ANY other PCIe card in your system and are running at x8. Would love to see those tests.

    • @davidcobra1735
      @davidcobra1735 1 year ago +1

      It doesn't matter. You either have enough lanes total for all the cards or something doesn't work.

    • @ytmaxxammo8591
      @ytmaxxammo8591 1 year ago

      Period….Well said. ✅

  • @techclub8528
    @techclub8528 1 year ago

    Thanks for the video. I'm upgrading my board this month to a Gen 5 AM5 board. My 4090 is currently using a modified vertical GPU riser that's Gen 3, and I can't really swap it out as I would need to redo all my hard tube runs; glad to know there won't really be a difference and I can use the same riser.

  • @jgorres
    @jgorres 2 years ago +5

    Since this testing shows that the bandwidth difference between PCIe Gen 3.0 & 4.0 is negligible at x16, I'd be interested in seeing how PCIe 4.0 x8 would do... Will fewer lanes have latency issues, or expose driver problems?

    • @12Burton24
      @12Burton24 2 years ago

      PCIe 4.0 x8 will have little advantage over PCIe 3.0 x16, because CPU lanes and M.2 drive lanes are also PCIe 4.0; the bandwidth itself is the same for 3.0 x16 vs 4.0 x8

    • @jgorres
      @jgorres 2 years ago

      @@12Burton24 Yes, I said the bandwidth was essentially the same, but testing would show any latency or driver issues with just 8 lanes, and that's what I'm curious about.

    • @12Burton24
      @12Burton24 2 years ago

      @@jgorres And I said that everything is faster, so even if the drivers are the same you should see a difference 😉

    • @jgorres
      @jgorres 2 years ago

      @@12Burton24 ??? I'm not understanding what you're saying.

  • @Siegdrifa
    @Siegdrifa 2 years ago

    Exactly what I was looking for! Thanks!

  • @xHighPotencyx
    @xHighPotencyx 2 years ago +4

    I feel like this is as good a place as any to ask: on motherboards where the PCIe 5.0 x16 slot goes to x8 with the PCIe 5.0 M.2 occupied, do you only get an effective 8 lanes of PCIe 4.0 with a 4.0 device? That is to say, are you limited by both interface compatibility and lane bifurcation simultaneously?

  • @irorules
    @irorules 2 years ago

    Hey, just a comment on the case ads. I totally get ads and have no problem, but I would prefer if you only advertised cases that have been tested. They don't have to be high-performing results, but being able to see a GN-level video for the case would be more helpful than just an ad spot. I also understand conflicts of interest, but in some case ads (be quiet!), you mention it performing well in testing. Love the work you guys put in!

  • @5371W
    @5371W 2 years ago +3

    I would drop the coin on a 4090, but at this point my entire computer is a bottleneck.

  • @maxabillon
    @maxabillon 2 years ago +1

    Nice one Steve 👍 I have a Gigabyte Z690I Lite ITX motherboard on order, and it only supports up to a PCIe 3.0 x16 slot

  • @dfwruss2392
    @dfwruss2392 2 years ago +17

    I'm not sure how many are gaming with a 4090 at 1080p. More would be gaming at 1440p, and the target is 4K, I would imagine. I wonder what the difference would be when 4K gaming is factored in.

    • @randomyoutubeuser8509
      @randomyoutubeuser8509 2 years ago +4

      Nobody who spends that kind of money on a 4090 should game on anything less than 4K, especially when the 4090 will barely be utilized at 1080p and 1440p in a lot of games

    • @amkyutube
      @amkyutube 2 years ago

      Still, the 3.0 vs 4.0 question is clearly an optimization issue rather than a bottleneck one.

    • @dustingarder7409
      @dustingarder7409 2 years ago

      @@randomyoutubeuser8509 I have an RX 7900 XTX and I know that it is slower, but not by a lot. I am playing Escape from Tarkov and Satisfactory (with mods). Escape from Tarkov runs on almost maxed settings at 1440p with about 55 fps (on the newest map). Satisfactory runs stable at 144fps most of the time but has drops to 100. The maximum benefit of the 4090 that I can imagine is 15% more fps, because I am not using RT or DLSS/FSR. What I want to say is that many games don't even run in 4K. Escape from Tarkov could work in 4K with lower settings and DLSS 3.0, but the game is so poorly optimized that DLSS looks like garbage in it and makes it literally unplayable

    • @HappyHubris
      @HappyHubris 2 years ago

      @@dustingarder7409 The 4090 is generally 30-65% faster (depending on settings and features), not 15%.
      ruclips.net/video/f1femKu-9BI/видео.html
      I use a 7900 XTX at 3440x1440 and definitely wouldn't mind a 4090, but that's just too much for a GPU!

  • @Ryan-1337
    @Ryan-1337 2 years ago +2

    My B350 board not only got support for Ryzen 5000, but it got a BIOS update for ReBAR as well. I'm glad I can stretch out PCIe Gen 3 for a bit longer and not lose any meaningful performance.

  • @Marc.Google
    @Marc.Google 5 months ago

    I have been wondering about this for a little while; I searched for the answer, and of course Steve has it covered! Great info, thanks for filming this @GamersNexus

  • @dorfkind9825
    @dorfkind9825 2 years ago +3

    The difference between PCIe 4.0 x16, x8, and x4 would be interesting

  • @Xeonzs
    @Xeonzs 2 years ago

    Very useful video. I was going to run into this issue today when I considered pairing my 4090 with an Aorus 8TB SSD AIC; after this video and some other research, I found running x8/x8 wouldn't be an issue for the 4090, but the drives would be an issue, because 1-2 of the NVMe SSDs on the AIC wouldn't be detected, while other models would run at lower speeds due to the bottlenecked bandwidth.
    Decided not to go with the AIC and instead just re-use my current NVMe SSDs and plop them onto the motherboard directly.

  • @ZiddyN
    @ZiddyN 2 years ago +6

    So it's about a 1% to 3% difference. Is it worth not slotting an M.2 into the shared lanes to keep your 4090 GPU on x16 Gen 4 lanes, then?

    • @csguak
      @csguak 2 years ago

      The difference is minimal; you can just use the M.2 on the same lanes

  • @Sargatanas2k2
    @Sargatanas2k2 2 years ago +2

    I love the "Joe rules, Steve drools" sign in the background 😂.

  • @TheBlueBunnyKen
    @TheBlueBunnyKen 2 years ago +5

    I have a Z490 motherboard and read that M.2 speeds don't improve gaming much; it's mainly for transferring files. I also learned that PCIe 3.0 vs 4.0 isn't a big enough difference to bottleneck the GPU. It's mainly bandwidth related

    • @paullasky6865
      @paullasky6865 2 years ago

      And I have the Z390. Using 2 NVMe drives doesn't affect the 16 lanes on the first PCIe port.

    • @Skippernomnomnom
      @Skippernomnomnom 2 years ago

      @@paullasky6865 what?

    • @paullasky6865
      @paullasky6865 2 years ago

      @@Skippernomnomnom People keep saying that if you use both M.2 drives you lose PCIe lanes on the main slot. It isn't true.

    • @Skippernomnomnom
      @Skippernomnomnom 2 years ago

      @@paullasky6865 Yeah. I have two NVMe drives and 4 HDDs on PCIe 3.0 and have no issues

  • @Flank.Sinatra.
    @Flank.Sinatra. 2 years ago +1

    When will you guys post 2022's best cases and coolers? Maybe also best fans, if you guys have time? Looking forward to it. Love what you do

  • @JDHitchman
    @JDHitchman 2 years ago +3

    Steve, just curious if you have ever considered building and selling a "GN Mark" hardware benchmark test suite along the lines of the upcoming LTT "Mark Bench"?

    • @GamersNexus
      @GamersNexus  2 years ago +9

      No, we use all our stuff internally only. It's dangerous to build and distribute that type of thing without an insane amount of controls (that we won't have over user setups) just because it can easily produce bad data that runs rampant for comparisons online - e.g. user benchmark. It's possible. FutureMark has done a good job. But we're not going to try and do that as we aren't set up to do it in a way that I think is responsible (e.g. would be too easy to generate bad data that feels like good data to the users)

  • @VGSoniTech
    @VGSoniTech 2 years ago

    Thanks! I have an i9-9900K on Z390 and I was really planning to upgrade my GPU to a 4090. This video was really helpful!

  • @BlackJesus8463
    @BlackJesus8463 2 years ago +3

    Gen 3 is legendary!

    • @m8x425
      @m8x425 2 years ago

      At least you watched the video.

  • @Tainted-Soul
    @Tainted-Soul 2 years ago

    Thanks for this. It is good that the board speed is ahead of the needs of the graphics cards, showing that this will not be the bottleneck for years to come :)

  • @alouisschafer7212
    @alouisschafer7212 2 years ago +3

    So even with a 4090, the most powerful GPU on the market, the difference between 3.0 and 4.0 is marginal.
    Now that is interesting, because it shows that PCIe is very future-proof.

  • @Nathanael_Forlorn
    @Nathanael_Forlorn 2 years ago +8

    Would have loved to see what happens to fps when limited to x8 lanes.
    With M.2 becoming more widespread, many boards drop from x16 to x8.

    • @JacobMartin-l1f
      @JacobMartin-l1f 2 years ago +3

      PCIe 4.0 x8 is equal to PCIe 3.0 x16, so according to this video's charts, nothing changes
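
The equivalence is just per-lane rate math; Gen 3 and Gen 4 both use 128b/130b encoding, so the overhead cancels (nominal figures):

\[
\underbrace{16\,\text{GT/s} \times \tfrac{128}{130} \times \tfrac{1}{8} \times 8\ \text{lanes}}_{\text{PCIe 4.0 x8}}
=
\underbrace{8\,\text{GT/s} \times \tfrac{128}{130} \times \tfrac{1}{8} \times 16\ \text{lanes}}_{\text{PCIe 3.0 x16}}
\approx 15.75\ \text{GB/s}
\]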

    • @Nathanael_Forlorn
      @Nathanael_Forlorn 2 years ago

      @@JacobMartin-l1f Oh, is it? Kinda makes sense, but I didn't know for certain whether bandwidth precisely doubled, or even if it correlates linearly at all. Thanks for letting me know!

    • @DasFlank
      @DasFlank 2 years ago

      @@JacobMartin-l1f I'm looking at a 4080 as an upgrade from my 2080 Ti. I have an NVMe boot drive which is forcing my top slot into x8, and it's an X470 board, so Gen 3 to boot. I feel a bit silly asking this, but is it a good idea to upgrade to an X570 board with PCIe Gen 4? Or am I worried about nothing?

    • @flipdry
      @flipdry 1 year ago

      @@DasFlank It may have some impact, but how much I can't say. I would either get a B550 board or move the M.2 SSD to a PCIe x4 adapter bracket to allow the top slot to operate at x16.

    • @Justifier
      @Justifier 1 year ago +1

      This comment actually has more relevance than you'd think.
      Lots of Z790 boards, such as the Z790 Aorus Master, run their PCIe lanes at x8 if you have an M.2 in the top CPU slot. There are going to be a ton of people who want to run Gen 5 M.2 storage when it becomes more widely available; how much is this going to impact GPU performance, if at all, on these boards?

  • @ivaniliev2272
    @ivaniliev2272 2 years ago +1

    RTX 4090: One kidney, please!
    German taxes and rents: Hold my beer!

  • @seanpasquarella4149
    @seanpasquarella4149 2 years ago +3

    So would it be preferable to plot an upgrade path through the 40 series if I'm going to stay on PCIe Gen 4 for a little while? Or would it be smarter to look into the high-end 30 series?

    • @charaznable8072
      @charaznable8072 1 year ago +1

      Um, watch the video? You will get your answer...

  • @x-techgaming
    @x-techgaming 2 years ago

    Wow, actually relevant in-video product advertisements... THAT'S refreshing!

  • @raulitrump460
    @raulitrump460 2 years ago +4

    You hit a CPU bottleneck with older CPUs before you hit a PCIe limitation

  • @basbas63
    @basbas63 1 year ago +2

    Good to see PCIe 3 is still good for the foreseeable future.
    On AMD B550, though, Gen 4 is giving me issues. I think B550/X570 motherboards, or the chipsets themselves, were not really ready for PCIe Gen 4, given the issues the platform had and still has with Gen 4.