Dual GPUs for OBS does NOT work - This suggestion will HURT your performance! Stream Guides

  • Published: 27 Oct 2024

Comments • 542

  • @marshallr.8121
    @marshallr.8121 4 years ago +136

    "people keep asking, i keep telling, and they wont believe me"
    ah yes, reminds me of working in IT-support

    • @mattsmechanicalssi5833
      @mattsmechanicalssi5833 4 years ago +5

      Being a Mechanic of 30+ years, I can say that my industry is not that much different!

    • @patrickgronemeyer3375
      @patrickgronemeyer3375 4 years ago +2

      Yeah, because he never tested H.264. He honestly should have done it with something like a GTX 1080 as your main GPU and a 900-series card as your 2nd. Not everyone has a 2080. Or even a 2050. Pretty sure there are mad gainz to be had with H.264

    • @TimSheehan
      @TimSheehan 4 years ago

      @@patrickgronemeyer3375 poe's law

    • @suico778
      @suico778 2 years ago

      People wanna see proof! You could call it annoying, or you could call it capital. All depends on what you do with it. Epos here did the latter, and we're thankful. It's a win-win. :)

    • @marshallr.8121
      @marshallr.8121 2 years ago

      @@suico778
      And generally there's nothing wrong with wanting to understand things; that should be encouraged. The difference is asking/researching why something is recommended vs. just asking what to do 👌 If you're just trying to figure out the thing to do, and you ask for advice, which you then get from someone knowledgeable, you should accept it.

  • @FatheredPuma81
    @FatheredPuma81 4 years ago +31

    Actually, if you use a 2nd GPU for encoding, it will result in less lag on the viewers' end if you have a weak GPU and are stressing it heavily. Trust me, I know first-hand and second-hand. I saw a streamer who was using NVENC play a game, and the stream was laggy; I told him to switch to Quick Sync and the lag was gone, never to be seen again. I switched from a CPU with an integrated GPU to one without, used the CPU for encoding and had issues; then I switched to NVENC on my GTX 960, and when I cranked the settings in Black Ops 4, the video I was recording dropped to around 3 FPS while I was still getting 30-60ish in game.
    You should probably try this out on older graphics cards instead of top-of-the-line ones.

  • @KadargoGaming
    @KadargoGaming 1 year ago +5

    I don't agree. In my lab I ran a few tests on 1st-gen and 2nd-gen i7 systems with a second GPU as the encoder, and everything was smooth. It DID NOT affect performance on those OLD systems while streaming live. I have been a computer technician since 1991; you can take my word in good FAITH.

  • @SLAY3R8888
    @SLAY3R8888 2 years ago +27

    First of all, EposVox, thanks for this video along with all your other informative content. I'm going to share my setup and why a dual-GPU streaming setup specifically benefited me. This isn't meant as an argument against the points EposVox made in the video, more a note of some exceptions to them; maybe it will help someone else in my situation. I'm also keeping in mind that this video is 2 years old, so there may have been developments in hardware / BIOS / OBS since then.

    I have a 6900 XT as my primary gaming GPU, and I never really worried about not having the NVENC encoder because I never had the internet upload bandwidth to stream until recently. My 6900 XT does a great job recording gameplay at high bitrates in H.265. Now that I have 100 Mbps upload, the 6900 XT can also handle streaming in H.264 just fine, but obviously there's a desire for more quality, even when streaming to YT at high bitrates, and the AMD AMF H.264 encoder is just not that great. This made me really want to try connecting a 3060 Ti I already had as a 2nd GPU and using it for stream encoding.

    This is how I did it, and why it worked. I have an Asus TUF X570 mobo and an R7 5800X. My mobo has an option to run a dual-GPU configuration with x16 (Gen4) for the main GPU and x4 for the second GPU (x16 / x4 mode). I don't know why everyone keeps saying dual GPU will only run x8/x8 when my PC is a clear exception; this may have changed as newer CPUs and BIOS updates came out, and my mobo manual says the mode is available for both Ryzen 2nd-gen and 3rd-gen CPUs. While this worked great, and I confirmed with GPU-Z that the main GPU got PCIe 4.0 x16 and the second got x4, I didn't like having the second GPU in the case for obvious reasons: it makes the main GPU above it run a lot warmer.

    So I tried a x1 PCIe riser (the kind used for GPU mining rigs) in the second x16 slot, which let me mount the encoding GPU outside the case. On paper, a single lane of any recent PCIe generation has enough bandwidth for even a moderately high encoding bitrate such as 40,000 kbps (sketched below). In testing, with the second x16 slot set to at least Gen3 in the BIOS, one lane had enough bandwidth for this to work great; Gen1 did not, and gave me tons of skipped encoding frames (not sure about Gen2, I can't remember; I ended up just using Gen3).

    After tinkering in the BIOS and getting this working properly, I can stream and record my gameplay to YouTube (I game in 4K and stream / record in 1440p) at a 40k bitrate with the NVENC encoder, and the 3060 Ti handles all of the encoding load; I confirmed this with Task Manager several times, with no skipped frames or issues. I think it may be reverting to old NVENC, as mentioned in this video, and using my 6900 XT for scene compositing, which is totally fine. GPU-Z confirms x16 PCIe Gen4 on my main GPU and x4 mode (x1 effective because of the riser) on the second. So I did not lose any gaming performance; in fact, benchmarking before and after, I definitely gained some by taking the encoding load off the 6900 XT.

    One other weird thing I wanted to mention: for this to work I had to leave GPU 0 selected in the NVENC encoder settings; it wouldn't work with GPU 1 selected. With GPU 0 selected it was definitely using GPU 1 for the encoding, though, as I double- and triple-checked in Task Manager; my 6900 XT does zero encoding now. I'm REALLY happy I tried this, because it kept me from needing to buy a different GPU or switch to a dual-PC setup. My setup works great because: 1.) my system doesn't have to run the main GPU at x8 with 2 GPUs connected, 2.) my main GPU is an AMD card, so it gains more gaming performance from having encoding offloaded, 3.) the encoding GPU is mounted on my desk outside the case, so it doesn't affect thermals, and 4.) I now get higher-quality encoding for my streams / recordings with NVENC, which was my main motivation; even old NVENC is still superior to AMD AMF H.264 quality. So if you have an AMD X570 system with a 2nd-gen or newer Ryzen CPU, an AMD GPU, and a lesser Nvidia GPU lying around that's new enough for NVENC, check your mobo manual and see if this is an option for you. I know that's a lot of IFs, but I thought I'd share in case someone else in my specific situation was thinking of doing this. Dual GPUs for streaming CAN work and CAN benefit your stream and gaming performance; you just have to have a very specific set of hardware for it to work correctly, or even be necessary in the first place. One last point from the video I wanted to address: game capture in OBS works just fine, and I've tested multiple games including COD Warzone with no issue detecting game capture. I'm not sure why this was an issue for EposVox and not for me; maybe OBS has changed things in the last few years. If I ever want to display capture, I can always switch encoding over to the 6900 XT for that, no big deal. I haven't tested display capture yet because I rarely use it anyway.
    I am open to anyone's thoughts / critiques / questions about my setup, I am really not here to argue with anyone or prove anyone wrong, just sharing what has worked for me, in hopes that this information will help someone else. I am also not an expert, just a nerd with a keyboard. Feel free to tell me if I got something wrong here. Thanks.
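
    The arithmetic behind that x1-riser claim is worth spelling out: what crosses the link is the raw frames headed to the encoder, not the finished 40,000 kbps stream (which is only ~5 MB/s). A rough sketch, using nominal per-lane PCIe rates (real-world throughput runs a bit lower):

        # Can a narrow PCIe link feed raw frames to a dedicated encoder GPU?
        PCIE_GBPS_PER_LANE = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969}  # GB/s per lane

        def raw_video_gbs(width, height, fps, bytes_per_px=1.5):
            # NV12 (8-bit 4:2:0), the usual format handed to the encoder: 1.5 B/px
            return width * height * bytes_per_px * fps / 1e9

        need = raw_video_gbs(2560, 1440, 60)  # ~0.33 GB/s for 1440p60
        for gen, rate in sorted(PCIE_GBPS_PER_LANE.items()):
            print(f"PCIe Gen{gen} x1: {rate:.2f} GB/s -> "
                  f"{'enough' if rate > need else 'too slow'}")
        # Gen1 x1 (0.25 GB/s) falls short, matching the skipped frames reported
        # above; Gen3 x1 (~0.99 GB/s) clears it with room to spare.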

    • @SLAY3R8888
      @SLAY3R8888 2 years ago +6

      Update: You can also buy a long x16 PCIe riser cable to get the full x4 connectivity on the second GPU if the x1 riser isn't giving you enough bandwidth for your encoding. The x1 riser seems to handle around a 40k bitrate at 1440p, but I ended up using a 3-foot x16 PCIe riser cable to get the full x4 connection on my second GPU. This let me record/stream at even higher bitrates, even 4K @ 50k bitrate, and it was still long enough to keep the second GPU mounted outside the case. Still running x16 @ Gen4 on the main GPU and x4 on the second (encoding) GPU.

    • @MATechPH
      @MATechPH 2 years ago +9

      @@SLAY3R8888 I can confirm that this works. I currently have a spare 3090 and use a primary 6900 XT for gaming.
      Yes, the 3090 is overkill, but I wanted to see whether it would work, as I lose some frames when I start streaming.
      The 3090 is currently connected via a mining riser, as I don't have a full PCIe Gen 3/4 riser cable.
      Planning to make a video about this as my first video on YouTube. Still testing and working on it, but it works fine.

    • @warlordianx23
      @warlordianx23 2 years ago +1

      Thanks man 😊 glad to know this

    • @nemmayol
      @nemmayol 2 years ago +2

      @@MATechPH man, we all have too many spare GPUs now

    • @CuervoVirtual
      @CuervoVirtual 1 year ago +3

      I can confirm that this also works for me. I have an X570 Asus TUF Gaming Pro mobo; in the main PCIe slot I have an RX 6650 XT, and in the second one I put a GTX 1050 I had kept. The AMD card runs at PCIe 4.0 x8 since the card itself is only wired for x8 (with an x16 card it would surely work without problems), and the other PCIe slot runs at x4 with the GTX 1050, enough to use the NVENC encoder. In my tests I can stream with the GTX 1050 using NVENC while I play on the RX 6650 XT, everything is stable, and the old NVENC gives me better quality than AMD's hardware H.264 codec; the AMD codec has improved a lot in recent months, but NVENC on the 1000-series generation still beats it, so I'm happy with my setup. I also benchmarked games and there is no performance loss. The RX 6650 XT for playing, the little GTX 1050 with NVENC for streaming, the RX 6650 XT with AMD's H.265 codec for recording gameplay; for video editing I still can't decide, though I think NVENC also gives somewhat better quality in Adobe Premiere and Sony Vegas.

  • @Phynellius
    @Phynellius 4 years ago +20

    not sure how you think gen 4 pci-e is going to do anything for your bandwidth since you're using gen 3 cards

  • @worldofjoseup
    @worldofjoseup 3 years ago +7

    I want operating systems to have better multi-GPU support. I feel like it's an opportunity that's really being missed. It would save the annoyance of having to run 2 different PCs and let someone just have 2 GPUs, which is debatably cheaper. This could also work for laptops such as the ROG Flow X13, which has an integrated GPU in the 5900HS, an RTX 3050 Ti, and an external RTX 3080. Not only would that benefit laptop streaming, it would also let people have computers they can use for gaming and streaming wherever they want.

  • @JickFincter
    @JickFincter 4 years ago +15

    PSA for any of you who have a second GPU: you can use that second GPU to run RTX Voice. I made a video that shows you how to run it on your secondary Nvidia card; it does not have to be an RTX card!

  • @jcrab72
    @jcrab72 4 years ago +23

    I am running a dual-GPU setup and I have seen a significant performance boost vs. a single-GPU setup. I use a ROG GTX 1080 Ti 11GB for gaming and a ROG GTX 960 4GB dedicated to OBS encoding. My system: i9-9900K, 32GB Corsair RGB Pro 3200MHz DDR4, ROG GTX 1080 Ti 11GB, ROG GTX 960 4GB, ASUS Z390 ROG Maximus XI Hero WiFi mobo. I think there are too many variables to definitively say that more than one GPU "will HURT your performance!" It might on the setup you were using, but it doesn't on the setup I use.

    • @JuliaYamYam
      @JuliaYamYam 4 years ago +1

      My theory is that it's because the 1080 Ti's NVENC hits RAM anyway, so the unnecessary bottlenecking he points out is present there even on NVENC. And obviously, taking some work off the main GPU should in theory help; it's just that in the RTX case it's not worth it. Anyway, I'll be getting a 1080 Ti next week, and if all goes well I'll be sure to test it with my old 1070 and see if it makes any difference for me as well :3

    • @ajinkyap7102
      @ajinkyap7102 4 years ago +1

      Julia, did you test it?? I have a GTX 1070 which I am unsure about selling, because I am thinking of using it to record my gameplay via OBS while playing on a primary 3000-series card!

    • @davetechme
      @davetechme 4 years ago +1

      I also found that dual GPUs do work, at least in my case. The difference is small and performance definitely doesn't hurt, but your wallet will.
      I've got dual GTX 1080s paired with a Ryzen 1700 @ 3.9GHz and 32GB of RAM @ 2933MHz. SLI sucks, so I keep it disabled until I need it. Until I can get my hands on a 3080, I decided to make the secondary GPU do the OBS encoding.
      I can't speak for streaming, but recording performance is marginally better when using the secondary GPU. In R6 Siege's benchmark tool, there's a 3% boost in FPS, a 5% decrease in GPU rendering time, and a 3% decrease in CPU rendering time for my arbitrary in-game settings at 1440p.
      Looking at x8 vs. x16 benchmarks, I'm not seeing much, if any, difference in framerates in the majority of titles, so I doubt there would be much difference if I physically removed the second GPU (eliminating the possibility of SLI in titles that support it).
      Now, if I didn't have a spare 1080 collecting dust (outdated, but no slouch either), I would not say it's worth purchasing a second GPU to do the encoding.
      But if you're considering building a second PC, it might be worthwhile to try a second GPU first.

    • @garloch
      @garloch 4 years ago +2

      The main problem with Epos's build/approach vs. yours: he is using a 2080 Ti, the first GPU to get close (like, really close) to the PCIe 3.0 x16 bandwidth limit. Obviously, forcing a GPU with that kind of bandwidth need into x8 mode will create a bottleneck and decrease performance; with a "weaker" GPU like yours, you'll be fine.

    • @CaptainRocky
      @CaptainRocky 3 years ago

      how are you guys dealing with temps inside the case with dual GPUs?

  • @jerjac25
    @jerjac25 4 years ago +6

    Can't remember if I posted this or canceled the post... But I run a 2080 FTW3 Ultra as my GPU 0 and my old 980 Ti as GPU 1. OBS is set to use GPU 1 as the encoding GPU. I record 3440x1440 (I prefer 2560x1080; off the top of my head I think it's 2560) at 60fps, at either 50 Mbps or lossless, with 4 audio streams. I get 0 frame loss in this mode, even with the preview turned on. I typically capture games running Unreal Engine.
    If I set the encoding GPU to 0 and try to do it all on the 2080, I start noticing rendering lag and dropped frames. It is night and day, clear as crystal: having 2 separate GPUs does indeed give me better performance.
    I do run as admin and I have Game Mode disabled in Windows 10 (this does make a big difference).
    I was actually thinking about swapping the 980 Ti for a Quadro for HDR recording.
    Just curious and wondering if this video only applies to older or lower-tier graphics cards.
    (Maybe different motherboards play a huge role in how this all affects performance; maybe it's how PCIe lanes are addressed or timed when running 2 cards.) I run a Maximus Hero VI.
    Maybe the games themselves play nicer with dual cards vs. single cards when capturing. I know 7 Days to Die is a lot more finicky at ultra settings than, say, Dying Light is at ultra.

    • @NUCLEARARMAMENT
      @NUCLEARARMAMENT 4 years ago

      Trying to record 7680x4800/4:2:0/8-bit/60p using AVC encoding via NVENC, by splitting up my desktop into four separate 3840x2400 resolution quadrants, with each quadrant being part of its own encoding session on one of my four 1080 Ti blowers in 4-way SLI. I think I am going to get this working eventually!

  • @TheUnquietWriter
    @TheUnquietWriter 4 years ago +9

    Thank you! Was about to do this and you’ve saved me a lot of time and frustration! Will stick with my single PC setup until I get a second PC built for streaming.

    • @TheArgoBots
      @TheArgoBots 3 years ago +1

      You just didn't learn anything new.

    • @warlordianx23
      @warlordianx23 2 years ago

      Use NDI once you have a 2nd pc

  • @ViewTube_Emperor_of_Mankind
    @ViewTube_Emperor_of_Mankind 1 year ago +8

    It's weird, but this was not true for me.
    I had 2x 2080 Ti, and especially when I streamed or recorded Cyberpunk 2077 there was a huge FPS difference if I specifically set the second, unused GPU for capturing/streaming. It made a difference of up to 30%, especially at the higher resolutions (2K/4K)... maybe HDR has something to do with it?

  • @WraithWTF
    @WraithWTF 4 years ago +17

    05:55 PCIe slots on a mobo run at the speed of the device plugged into them; they don't magically upgrade your device to run at the highest speed they're capable of. Even though the slot is Gen4-capable, the card is only Gen3 and therefore will only run at Gen3 bandwidth. You won't see any performance boost from an X570 board in this regard.

  • @SeaOfTorment
    @SeaOfTorment 1 year ago +3

    Hey EposVox! I know you made this video 3 years ago, but I was really considering getting an Intel GPU as the secondary GPU in my system (my primary GPU is an Nvidia 4060), and I was wondering if all these concerns carry over to my case. I want to use the Intel card just for stream encoding and normal local encoding, like with HandBrake and ffmpeg. I think the concerns still apply, but I'm having a hard time grasping everything, and I'd like to confirm with you before I conclude my decision on getting an Intel card for the sake of encoding. (I know the 40 series has AV1 as well, but I'd like both flavors of AV1 just for playing around.)
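
    On the local-encoding half of that question: Arc's AV1 encoder shows up in ffmpeg as av1_qsv, so batch jobs can be pinned to the Intel card without involving the main GPU. A minimal sketch (placeholder file names and bitrate; assumes an ffmpeg build with Quick Sync / oneVPL support and the Arc card present):

        import subprocess

        # Offline AV1 transcode on the Intel Arc card's media engine.
        subprocess.run([
            "ffmpeg", "-i", "input.mkv",
            "-c:v", "av1_qsv", "-b:v", "6000k",  # AV1 via Quick Sync
            "-c:a", "copy",
            "output_av1.mkv",
        ], check=True)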

  • @tehmoriz
    @tehmoriz 4 years ago +2

    BTW, PCI-e 4.0 will only give you increased bandwidth if and only if your device is also PCI-e 4.0.
    An RTX 2080 running in x8 mode will only give you the bandwidth of PCI-e 3.0 x8, even if the motherboard has PCI-e 4.0 slots (see the sketch below).
    AFAIK, there are no X570 motherboards that can split a 4.0 x8 link into two 3.0 x16 links.
    Also, I wonder if the situation changes if I introduce a capture card into the mix:
    -use the second GPU to render and encode based on input from a capture card, using a mirrored screen setup from the first GPU.
    -use Unraid to run two instances of Windows, assign a GPU to each VM, and stream on the second VM via capture card and GPU.
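
    The negotiation rule above is simple to state in code (a toy sketch with nominal per-lane rates):

        # A PCIe link trains to the LOWER of the two ends, in both
        # generation and lane count.
        PCIE_GBPS_PER_LANE = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969}  # GB/s per lane

        def link_gbs(card_gen, slot_gen, card_lanes, slot_lanes):
            gen = min(card_gen, slot_gen)
            lanes = min(card_lanes, slot_lanes)
            return PCIE_GBPS_PER_LANE[gen] * lanes

        # RTX 2080 (a Gen3 x16 card) in a Gen4 slot wired for x8:
        print(f"{link_gbs(3, 4, 16, 8):.1f} GB/s")  # ~7.9 GB/s, i.e. plain Gen3 x8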

  • @AGeekyGuy
    @AGeekyGuy 1 year ago +1

    I was able to get a huge benefit using dual 3090s: one GPU runs OBS for streaming and recording while the other runs COD:MW and COD:MW2. The game alone can sometimes come close to maxing out the GPU when gaming at max settings.

    • @magog6852
      @magog6852 1 year ago

      Any PCIe lane issues? What mobo do you have?

  • @Suaven95
    @Suaven95 4 years ago +5

    I'm not entirely convinced. I have had a 2-GPU setup for quite some time and don't have the drop in performance you speak of. I will admit that I don't run any monitors off my 2nd GPU. My setup is an RTX 2070 with a GTX 1060 3GB. I have 2 monitors and I run both off just the 2070. The only time I had an issue where game capture didn't pull in the game feed was when I had the game and OBS installed on 2 different drives. Could you run the test again with no screen plugged into the 2nd GPU? For science?

  • @sethwilliamson
    @sethwilliamson 4 years ago +1

    That PCIe lane issue is what finally pushed me to just run 2 PCs. Especially since I was needing other expansion like a faster network card, an HBA, capture card, 3 peripheral monitors that needed to be driven somehow, thunderbolt hardware, PCIe NVME... Lanes were an issue. What surprised me was that having two machines not only works better, it has worked out cheaper than trying to have one machine to rule them all. So now I conceptually think of the game PC as a console. Hardware and software are chosen to just play games, period. Everything else is in the workstation.
    Now that we're seeing more lanes on some of the newer architectures where historically you'd have to seriously sacrifice clock speed to get it, I may revisit it, but I've been pretty happy with it as is.

  • @Qedhup
    @Qedhup 4 years ago +3

    This is a really good video. You definitely deserve more views and upvotes! I'm liking your channel so far. Gonna spread it around :)

    • @EposVox
      @EposVox  4 years ago

      Thank you! Please do

  • @KosmicKrow
    @KosmicKrow 2 years ago +1

    Thanks for the video and thorough explanation. Trying to find actual information on the subject of dual-GPU usage is really rough.
    I've been using two cards, an RTX 3060 and an RX 5500 XT, for a while now. The 5500 XT runs two monitors, a 32" 2K at 120Hz and a 22" 1080p at 60Hz, for maps, net, social, etc., while the 3060 handles gaming on my main monitor, a 1080p at 120Hz. The 5500 XT was a spare, so I figured I'd give it a try.
    Overall, it works well enough, but no benchmarks have been run in a while. Looks like I've figured out what I'm doing this week.
    Also, I have games reporting incorrect 4K resolutions, particularly in the NVIDIA GeForce application. I figured it was something I was doing wrong.

  • @unibrowser1
    @unibrowser1 3 years ago +5

    Can this be revisited? Or is this still set in stone?

  • @Goodmanperson55
    @Goodmanperson55 4 years ago +3

    This showed two Nvidia cards and the final conclusion is pretty understandable. You're essentially using the same exact encoding chip while adding more complication. But I'm wondering, what if I have a Radeon card but I want to use Nvenc? Would slotting in something like a GTX 1050 as a dedicated encoder card be worth it?

    • @QuickLikeVic
      @QuickLikeVic 1 year ago

      That’s what I want to know too

  • @articparadox34
    @articparadox34 4 years ago +3

    Turns out it does work: just set the hardware encoder to the old NVENC, run Streamlabs as administrator, set the GPU to 1, plug your main monitor into the second GPU and your second monitor into the top GPU, and it works perfectly fine without SLI.
    I tested it and it worked perfectly fine with the Max Quality preset, Profile set to High, GPU set to 1, Max B-frames set to 2, CBR at 8000, and Force GPU as render device in the advanced tab. Color format set to NV12, YUV color space 709, and YUV color range Full.

    • @potter_productions
      @potter_productions 3 years ago

      This needs testing. I always find it odd that they give you the option to stream and stuff from another GPU

  • @milesbarrineau803
    @milesbarrineau803 4 years ago +1

    So I know I am finding this video 6 months after it posted, so RIP, but I have been streaming to my grand audience of 2-5 people recently, and some of the games I play completely max out my poor 1070 at reasonable settings and a high refresh rate (185 Hz 1080p in my case). I watched a few videos, did some research, then watched this video, and decided I would still give it a try using my old 1050 Ti. On my end, for my games and whatnot, there wasn't any change (which I wasn't expecting there to be), but on the viewer end of things the quality jump and consistency improvement were huge. I get where you are coming from in this video, but if you have older cards and want to keep your primary card from getting completely killed, it actually does help a ton on the viewer side of things.

  • @NikosNisyros
    @NikosNisyros 4 years ago +2

    Hey there, Mr. EposVox! I am glad that you still work on these kinds of videos! Quick question: can one encode H.265 at 7680x4320, 24 fps, using one of the GTX 16-series or RTX 20-series GPUs, since those are supposed to use the new NVENC Turing SIP core? (Preferably using OBS.) I recall that in the past you tried the cheapest Pascal Quadro because it was said to support 8K HEVC encoding, but nothing really worked as it should. Thanks!

  • @lalljp
    @lalljp 2 years ago +4

    It's false that NVENC encoding has no impact on performance. I have an RTX 2060 Super, use 4:4:4 color, and record 4K; CUDA usage is usually 40%, and encoding alone draws 60 watts. Nvidia's claim of "separate" is just plain false. NVENC may only be able to handle low-res 4:2:0 without CUDA.

  • @zlz5619
    @zlz5619 2 years ago +3

    For example, in 2022 Intel released their Arc GPUs, which have an AV1 encoder for streaming. I have a 3080 Ti that doesn't have an AV1 encoder, just a decoder; only the 40 series and the new Radeon 7000 have this codec for streaming. I don't care if I lose a few frames, but can I run multiple GPUs and use the Intel one to encode AV1 for more quality?

  • @zZapStrapZz
    @zZapStrapZz 3 years ago +3

    Glad I found this cuz I was just wondering if a cheap second graphics card for encoding in obs would help my game performance

  • @CalebAble
    @CalebAble 4 years ago +6

    Yea, it works great for me buddy. I allocate OBS to my integrated AMD vega 8 graphics card, leaving my gpu entirely free for gaming.

  • @garagetube2830
    @garagetube2830 4 years ago +7

    The video posted 3 minutes ago, it's 19 minutes long, and there's already one dislike?? What kind of garbage are these people??

    • @BlueBarracuda
      @BlueBarracuda 4 years ago +3

      The people who don't actually watch the video and wholeheartedly believe that 2 GPUs help with video encoding/decoding. Or just people who don't like Epos's videos.

    • @EposVox
      @EposVox  4 years ago

      THANK YOU

    • @surpriseblueviana3803
      @surpriseblueviana3803 4 years ago +2

      Likes or dislikes are the same to YouTube... they all count as engagement, so don't worry

  • @moofree
    @moofree 4 years ago +9

    Never had much luck with the AMD encoder on my VII, which usually renders gaming a stuttery mess for some reason, but it works great with my secondary GTX 750 via NVENC.

    • @amirpourghoureiyan1637
      @amirpourghoureiyan1637 4 years ago +2

      That card's architecture was never really designed for anything other than the data centre; the VII's good performance in games is a miracle given its weird placement in the market, especially since AMD pulled the plug on it barely a year into its life.

    • @StephenMorrison
      @StephenMorrison 4 years ago +2

      I had an issue like that with my Vega 64. I was trying to record in OBS and it was a mess. I didn't realize I had ReLive running at the same time; turned it off and OBS records as smooth as butter.

    • @suhas-qw4nu
      @suhas-qw4nu 4 years ago

      @@StephenMorrison Same with Nvidia's new NVENC.
      The old one runs fine regardless of ShadowPlay

    • @benjiluvsyou
      @benjiluvsyou 4 years ago

      Stephen Morrison, what is ReLive? I have a 5600 XT and the run-as-admin trick works, but it's still a little choppy at times depending on the game.

    • @b_ennn972
      @b_ennn972 4 years ago

      So you use an AMD GPU to play and an NVIDIA GPU to encode?

  • @andrescastro5027
    @andrescastro5027 4 years ago +31

    I can sleep in peace, by the way... I was one of the guys who asked about this topic xD

  • @Blackeye1987
    @Blackeye1987 1 year ago

    I don't think people actually mean encoding on the second GPU, but rather scene composition on the second GPU, since scene composition is SUPER hard on the GPU.

  • @qfan8852
    @qfan8852 2 years ago

    Not sure what I'm doing wrong. On my RTX 3090 I can only do OBS recording at 4K60; there isn't enough encoder headroom left to also run a lower-resolution live stream simultaneously. If I run both, both become a slideshow.
    Adding a second GPU does help me: putting either streaming or recording, or both encoding streams, onto the second GPU makes everything smooth.

  • @Djmaxofficial
    @Djmaxofficial 11 months ago +1

    Dual GPU does work. What I noticed is that when I open OBS, it instantly uses 15% of the GPU's 3D engine, which can make your stream lag when a game uses 100% of the GPU. Connect the second monitor to the second GPU (no need to encode on it). Open OBS and move it to the second screen connected to the second GPU; in my case the 3D load was 20% just from having OBS open, and it simply shifts to the second GPU. When I game and stream I don't lose performance, and the stream is no longer choppy with all settings maxed in the game I play.

  • @Iamkaden
    @Iamkaden 1 year ago +1

    I can confirm about 1:56: streaming Minecraft with ray tracing on an RTX 3080 (99-100% GPU load) with OBS to Twitch WILL cause the stream to lag SO BAD, UNLESS you cap the framerate to give OBS some of the GPU headroom to encode. I was thinking of getting a low-profile GPU, such as a 1650, to do dedicated encoding and take the load off my 3080. If I ever do get one, I could upload a video to debunk this and also test Minecraft with ray tracing again.

    • @lellollo
      @lellollo 9 months ago +1

      Subbed to your channel. Please let me know if you test it, because I want to try adding a 1650 to my rig for the same reason as yours.
      IMO this man must have had a bad PSU to be getting worse performance

    • @Iamkaden
      @Iamkaden 9 months ago +1

      Hey there! Thanks for the sub! If I end up doing that, I will definitely upload it. @@lellollo

  • @BlueCharly
    @BlueCharly 2 years ago +2

    Okay, so far, so good. But what if I used a capture card to catch the signal from my MAIN GPU, bring it back into the same PC and into OBS, and let it be rendered by the second GPU, then send it off to whatever streaming platform the user is using?
    It would grab the incoming signal, render it, and send it away.
    Sure, there would be some loss in quality, depending on the capture card. But if my max output to the stream is 1080p anyway, COULD that work without significant issues?
    I hope that this makes sense somehow.

  • @Gotiansgames
    @Gotiansgames 4 years ago +6

    I've been waiting for this video. Now I can direct people here instead of tagging you on Twitter lol

  • @TechnicalGamingChannel
    @TechnicalGamingChannel 4 years ago +19

    watching this in fullscreen was a mistake
    that bluescreen got me

    • @EposVox
      @EposVox  4 years ago +4

      hahahahaha

    • @streamershaven
      @streamershaven 4 years ago

      It would have got me if Win 10 bluescreens didn't look different now. (Almost had me tho)

  • @Rayer24
    @Rayer24 1 year ago +1

    The only reason I'm using a second gpu for encoding is because the AMD encoder sucks for streaming (recording makes no difference with a high enough bitrate). No difference in operation and the secondary gpu is only used for encoding, nothing else.

  • @glatze_-.-
    @glatze_-.- 2 years ago +1

    I can't confirm this. I used a system with an RX 6800 OC for gaming and a GTX 1070 for streaming, and I have personally had a good experience with dual-GPU streaming: no FPS drops and a good-quality stream, as opposed to single-GPU streaming. Maybe you used the wrong settings.

  • @IntermitTech
    @IntermitTech 4 years ago +4

    First off, awesome video, thank you for putting this out there! While I agree with the premise of the video and understand why you explain it this way, it does need to be said that encoding DOES have some effect on the 3D performance of your card. Although it is a dedicated ASIC in the GPU, it still eats into the GPU's power budget and a little bit of memory bandwidth, so 3D performance will be ever so slightly lower because of it.
    Also, you are mainly talking about live-streaming gaming. In my experience, especially if you are trying to do 4K60 streaming from multiple IP cameras and such, it can help to have multiple graphics cards. Likewise, if you have a very low-power GPU, offloading encoding to the onboard Intel GPU can give you just that bit of extra performance.
    So as I said, I agree with your video and fully understand why you explained it this way: if you are trying to encode a game running on GPU 1 with GPU 2, it has to copy every buffer across both memory controllers and your PCIe links, eating up a lot of performance and adding a lot of latency. So I get you, but technically it's complex. ;)

  • @ipKonfig
    @ipKonfig 2 years ago

    The thing is, NVENC on your GPU DOES actually use GPU CUDA, as we now know today. But that's not a big deal. The issue is when you're playing a game that's using 99% of your GPU and that same GPU is trying to encode for streaming: it doesn't have a lot of headroom, and the stream can stutter. I've seen it on single-GPU setups and can even reproduce it. The ultimate solution is a dual-PC setup, as we all know, but that ain't cheap :/

    • @EposVox
      @EposVox  2 years ago

      Psycho Visual Tuning uses CUDA. Some of the crazy settings in the hidden P7 preset use tensor cores. The encoder itself, left alone, does NOT use CUDA, nope.
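
      For anyone who wants to verify this on their own hardware, NVML exposes separate utilization counters for the 3D/CUDA engine and the NVENC block. A minimal sketch (assumes the pynvml package and an NVIDIA driver); run it while recording and toggle Psycho Visual Tuning to see which counter moves:

          import time
          from pynvml import (nvmlInit, nvmlDeviceGetHandleByIndex,
                              nvmlDeviceGetUtilizationRates,
                              nvmlDeviceGetEncoderUtilization)

          nvmlInit()
          gpu = nvmlDeviceGetHandleByIndex(0)  # the GPU you believe is encoding
          for _ in range(30):  # sample once a second for ~30 s
              core = nvmlDeviceGetUtilizationRates(gpu).gpu         # 3D/CUDA engine, %
              enc, _period = nvmlDeviceGetEncoderUtilization(gpu)   # NVENC block, %
              print(f"3D/CUDA: {core:3d}%   NVENC: {enc:3d}%")
              time.sleep(1)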

  • @randomletters1834
    @randomletters1834 1 year ago

    Maybe technology has changed since the release of this video, but I will share my test: running a 4070 Ti with an i9-13900K and 64GB of RAM, playing Forza Horizon 5 in 4K at max settings, I average about 120fps. When I turn on OBS recording, I get a warning message saying I am using too much VRAM and should turn my graphics down or close other programs to free up VRAM; my FPS also dropped below 50 while OBS was recording. I threw my old GPU, a 3060 Ti, into my remaining PCIe slot and told OBS to use that for video recording; my framerates no longer dropped and I no longer had a VRAM issue on my main GPU. Maybe this theory is different when it comes to recording vs. streaming. Based on my personal tests, I would not say that it is 100% pointless to run a secondary GPU.

  • @DriftaBeatz
    @DriftaBeatz 4 years ago +1

    Hey Epos, I'm currently running a dual-PC setup with a 1800X AMD rig doing the encoding on the x264 medium preset. Should I just stick with the dual-PC capture card setup, even when I upgrade to the new Nvidia GPUs coming up? I could run NVENC (new), but I'm thinking I'm better off sticking with the 2-PC setup. Thanks for all the work you do and the info you share!!! You're a legend!!!

    • @EposVox
      @EposVox  4 years ago

      If you have it, might as well keep using it!

  • @yuinekonyan5153
    @yuinekonyan5153 4 years ago +2

    There is a different scenario under which a second GPU helps, but it is rarely discussed. Say you have a capture card in the same machine you play on, and rather than using the Game Capture source you use your capture card, so you don't need to enable the dual-GPU option. If you play a competitive game where you want to max out your FPS, frame capping is not an option; OBS is still using your GPU for its own purposes, and uncapped FPS leads to missed frames due to rendering lag. The solution: get a second GPU. It doesn't need to support NVENC, as you're not going to use its encoder. Before starting OBS, make the monitor hooked to the second GPU your primary, which forces OBS to use that GPU. After starting OBS, make your main gaming monitor primary again. Now your game fully utilizes the main GPU while OBS has its own dedicated GPU: no more missed frames due to rendering lag, no need to cap your FPS, no need to use your main GPU for OBS purposes and reduce framerates. Hope @EposVox can test this scenario as well.

    • @EposVox
      @EposVox  4 years ago

      There's virtually no reason to do that, however. Capture card always performs worse than game capture.
      So there's like 2 games on the market that block game capture and display capture is still generally a better idea

    • @yuinekonyan5153
      @yuinekonyan5153 4 years ago

      @@EposVox I would disagree. The performance difference with a capture card, if any, is negligible. But the whole point here is different: the second GPU is used for OBS rendering, which lets you avoid frame drops while keeping your FPS uncapped. With XSplit you have a command-line argument "mixeradapter" where you can select the actual rendering GPU. With OBS you have to temporarily switch your primary monitor (in Windows Settings) to one hooked to your secondary GPU before you start OBS to achieve the same. I strongly recommend you try this out. You can immediately see the results in Task Manager with Windows 10 2004, as the second GPU picks up a small constant load after OBS starts. You can then switch back to your primary monitor/GPU and run the game; OBS will keep running on the secondary GPU.

    • @EposVox
      @EposVox  4 years ago

      Yes half of this video involved me testing where OBS is running entirely on the second GPU

    • @madbruv
      @madbruv 2 years ago +1

      @@EposVox my 5700x running 16x and my 1660 4x would like to disagree

  • @bigdoggetom6549
    @bigdoggetom6549 2 years ago +1

    TL;DW: is this basically just "you can do it, but your GPUs both run at x8, which is worse than one at x16"?

  • @LassoKid7777
    @LassoKid7777 3 years ago +8

    I run AMD in PCIe x16, then Nvidia just for encoding streams in PCIe x8; it works well

    • @ElectricBlakeGames
      @ElectricBlakeGames 3 years ago +1

      Right, this guy needs to retest with one AMD GPU and one Nvidia GPU

    • @mitsu002
      @mitsu002 2 years ago

      I have a primary 6600 XT; will an Nvidia 2060 as a secondary for NVENC affect performance?

  • @dest-125
    @dest-125 3 years ago +2

    I happened to try this in my PC because my performance was suffering somehow while streaming. I had a GT 730 lying around, so I put it in my PC (the main reason was that I needed VGA for a second monitor). The GT 730 helped me get better frames without losing any frames. In my case it helped me stream.

    • @dest-125
      @dest-125 3 years ago +1

      BTW, my main card is an RX 5500 XT; it used to be an RX 580, but that was failing, so I upgraded

  • @PhantomBruv
    @PhantomBruv 3 years ago

    Amazing tests, thank you so much!

  • @drakegostream
    @drakegostream 4 years ago +2

    So back in your New NVENC guide you said to turn "Psycho Visual Tuning" on. Now you're saying that it uses CUDA cores, which I guess lowers my game performance. I'm confused now, why should I turn this on then?

    • @EposVox
      @EposVox  4 years ago

      Because it improves quality. Impact on normal gaming is overall negligible in many cases, but when I'm trying to get exact performance differences purely with the difference in dual GPU, it was a variable that needed to be removed

    • @drakegostream
      @drakegostream 4 years ago

      EposVox makes sense, thank you

  • @DCrownTrading
    @DCrownTrading 1 year ago

    OBS GPU settings... / 0 = auto-select the GPU to use. / 1 = select GPU 1 / 2 = force GPU 2 to render graphics in place of 1. You might want to try again, but this time use setting 2 so that it picks up and uses GPU 2. With settings 0 and 1, OBS was still only using GPU 1, while GPU 2 was all like... "Please put me to work"... or "Let GPU 1 do all of the work while I sit here and look pretty." Hope it helps. I have been running tests on rendering/streaming in 4K right now. I have 2 Nvidia Quadro M4000s and 1 Nvidia Tesla card. I am researching whether I could get OBS to use the Nvidia Tesla card to stream in place of either of the M4000s in SLI, but have not found anything yet as of the time of this message. Good luck. Looking forward to your next clarification video. Thanks.

    • @EposVox
      @EposVox  1 year ago +1

      GPUs are ordered from 0. 0 is your primary GPU, 1 is your secondary GPU. There is no setting within OBS to render on second GPU - that has to be done through the Win10/11 graphics control panel. The setting in the encoding settings you're referring to only offloads encoding (as I showed/explained) and again starts at 0, would max out at 1 if you have 2 GPUs.
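
      A quick way to check which physical card sits at which index before touching the encoder's GPU field (a sketch; assumes the pynvml package, and note NVML's ordering can differ from the adapter order Windows/OBS uses, so treat it as a cross-check):

          from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                              nvmlDeviceGetHandleByIndex, nvmlDeviceGetName)

          nvmlInit()
          for i in range(nvmlDeviceGetCount()):
              # older pynvml versions return bytes here rather than str
              print(f"GPU {i}: {nvmlDeviceGetName(nvmlDeviceGetHandleByIndex(i))}")
          nvmlShutdown()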

  • @JavoCover
    @JavoCover 4 years ago

    I have low-spec hardware:
    R5 2200G, 2x8GB 3200 Corsair LPX, B450 Pro4.
    Before the death of the used "POS" RX 570 I bought, I connected an old VGA monitor to the iGPU and my main monitor to the 570, ran OBS with the AMD encoder on the iGPU screen and the game on the 570. I could see the preview on the iGPU monitor, and while the stream itself was still low quality, at least it was more consistent. Task Manager showed encoder usage on the iGPU and 3D usage on the 570. I could not keep testing, as the 570 died and my game had a severe CPU bottleneck.
    Anyway, I'm going to get my RX 5600 XT by the end of the month and I'm planning to test the dual-GPU stuff further with the R5 2200G, but only for a short time, as I'm getting an R7 2700 in October and the performance impact won't matter anymore.

  • @LainSilkGames
    @LainSilkGames 7 months ago +2

    Any update 4 years now?

  • @RetrOrigin
    @RetrOrigin 2 months ago

    Bandwidth shouldn't be a problem these days with chipsets such as the X670E, which can keep the primary PCIe slot running at x16 regardless of what you connect to the other slots, since the primary slot is usually handled by the CPU and the other slots by the chipset.

  • @Saturn2888
    @Saturn2888 1 year ago

    In your AV1 Q&A livestream, you said something about the A380 dual-GPU making AV1 viable? Do you mean only if you have two 8-lane cards or with newer motherboards and the 8-lane GPU, it might finally be viable? I'm pretty sure rebar should also help.

  • @MG-ff7se
    @MG-ff7se 3 years ago +1

    True, you lose somewhat in performance because you are technically splitting the bandwidth between the two cards. However, for recording or streaming, using the encoder from the second card does help take load off the main card. Remember that the difference between the two cards plays a major role, like a 2080 paired with a 980 Ti. I use a 2080 Ti with a 2080 as my recording/streaming card and I suffer no performance issues. Sure, I lost bandwidth on my card, but I gained a recording/streaming PC. Remember, for this to actually work you have to change the card in the settings to the correct GPU you want to use as the new encoder.

  • @frostmedia
    @frostmedia 4 years ago

    I use the iGPU from the CPU for OBS, and in my tests it does not negatively affect the framerate. OBS alerts me with an overload (at my settings) if I play a game, record, and let a YouTube video run on my main GPU. But if I use the iGPU, I get the same frame drops without the overload alert. Anyway, your video opened my eyes and made me think about it, thanks!

  • @arbelarad1980
    @arbelarad1980 2 years ago

    I don't think these issues still apply today.
    I have been using dual GPUs for some time now with StreamFX NVENC: H.264 for streaming and H.265 for recording, with the main OBS process on the primary GPU.
    This definitely works on the 2nd GPU, as my new main is an AMD card.
    I have been using the max quality presets for streaming + recording at 1440p on a 1660 Ti, and it will almost max the card out.
    This is NOT using the slow SLI support mode.

  • @BarnardoPlays
    @BarnardoPlays 1 year ago

    I would assume this still holds true for AV1. Since Intel Arc A380 cards were released (tentatively, not actually manifest), I'd seen them suggested as a cheap addition to add "all the advantages of" AV1 encoding to a system, including by at least one of the better-known YouTubers, with no information about how to set that up practically, nor about how it actually performs in practice.
    I figure all the issues would remain and it still wouldn't be advised. But with Nvidia and AMD releasing their flagship cards first, AV1 encoders from anyone but Intel are in the over-$1k price range for the time being, so I expect pressure for this logic to increase. Maybe it's worth revisiting to have an updated fact check on the suggestion?

  • @davidjdunham
    @davidjdunham 1 year ago

    I use 2 GPUs because I have an RX 6400 and an RX 6500, which both don't have encoders on them, so it really does help me out when gaming or streaming.

  • @heyguyslolGAMING
    @heyguyslolGAMING 4 years ago

    This was something I was considering but never went forward to actually test, so thank you very much for doing this. The problem I've had is that I want to stream at 720p60 and record at 1080p60, but I can't use the NVENC (new) encoder to stream and record, because the (new) encoder doesn't allow different output scaled resolutions. I tried using NVENC (new) and old together to do this, and the recording was a stuttery mess. So the only option I have is to use x264 for streaming and NVENC (new) for recording, rescaling the stream output with x264 in the Output/Stream settings.
    Switching to NVENC (new) for streaming and x264 for recording isn't an option either, and honestly, if I have to use x264, I would rather use it for streaming over NVENC when I'm using a camera, because I've tested and found that NVENC (new or old) causes some serious artifacts with cameras regardless of scale, fps, or camera quality (webcam or DSLR). NVENC just sucks for stream cameras.
    Optimum Tech did a video on this a while back, but his issue was camera stuttering, not artifacts. So, in short, if I want to stream and record using NVENC (new), I have to do both at 1080p60, which I completely agree is NOT a good idea; I agree with you that 720p60 is the ideal stream quality setting for most gamers on Twitch. The hardware I'm using for streaming is a 2070 Super and a 2950X: streaming with x264 scaled down to 720p60 and recording NVENC (new) 1080p60. I wish there was a way to do it all off the GPU :( but the scaling issues and camera artifacting just don't work.
    BTW, I also tested using NDI and sending the output to another PC and recording that way, but the quality loss made it pointless to use.
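
    For what it's worth, the one-scaled-output limit is an OBS constraint rather than an NVENC one: a single ffmpeg process will happily split one source into two NVENC sessions at different resolutions. A rough sketch (an offline file transcode, not a live capture; placeholder names and bitrates, and it assumes an ffmpeg build with h264_nvenc enabled):

        import subprocess

        # One source, two NVENC encodes at different scaled resolutions.
        subprocess.run([
            "ffmpeg", "-i", "capture.mkv",
            "-filter_complex",
            "[0:v]split=2[s][r];[s]scale=1280:720[st];[r]scale=1920:1080[rec]",
            "-map", "[st]",  "-c:v", "h264_nvenc", "-b:v", "4500k",  "stream_720p.mp4",
            "-map", "[rec]", "-c:v", "h264_nvenc", "-b:v", "12000k", "record_1080p.mp4",
        ], check=True)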

    • @EposVox
      @EposVox  4 years ago

      For what it's worth I've never once encountered the camera issue he did and I've never heard of anyone else doing so either :P

    • @heyguyslolGAMING
      @heyguyslolGAMING 4 years ago

      @@EposVox I'm in the middle of a move, so I can't exactly shoot you a test vid to show you, but I will do so at some point. I've talked to a few others as well, so I'm not the only person who has experienced this.

  • @nxpstergaming4924
    @nxpstergaming4924 3 years ago

    Well, I do dual GPU and see a 30-frame difference... but you're right, it doesn't make sense since you have "dedicated hardware". I don't know why it works, but it works. I have a 3060 Ti running games and a GTX 1060 running my encoding.

  • @et0h7an
    @et0h7an 4 years ago +1

    I didn't actually hear a question, but... why exactly is it pointless to use a second GPU for encoding if there is hardware specifically used for encoding? And what's the conclusion other than "don't use dual GPUs"? Is a single GPU sufficient for streaming, or do you need a dual-PC setup?

    • @EposVox
      @EposVox  4 years ago

      Because that hardware is already on your first GPU, and the process of transferring everything to your second GPU is far more costly than just encoding on the first one, as the entire video sets out to prove

  • @iamalthepal
    @iamalthepal 4 years ago +1

    @16:45 will they fix this issue in future updates? I happened to notice some game devs have started to incorporate Vulkan, like Rainbow Six Siege; before starting the game, it gives you the option of launching in DirectX or Vulkan. My question is, will OBS work with Vulkan in that problematic case you came across? Maybe try to replicate it using that game engine. Just a thought...

    • @EposVox
      @EposVox  4 years ago +1

      The way it was explained to me by the OBS devs was that was inherent to how DX12 works and hooking it. There's nothing to fix, it just can't be done. It can *capture* the game, it just can't share that capture across two GPUs. I think this is a different issue than what you're talking about.

  • @RentaEric
    @RentaEric 8 months ago

    I am currently running an Intel Arc A770 in my PCIe 4.0 x4 slot, next to my main GPU, a 4070 Super. The Arc card idles at 0 watts, runs at 50% when recording at high visual settings and 90% when recording and streaming, and lets my 4070 Super run at similar or higher performance.

  • @Wolfox360
    @Wolfox360 1 year ago +2

    You have to use the second card ONLY for encoding. You should use a capture card to get the output of the main card and send it back into the system; then OBS will use the second card only for encoding. The main system used for playing will not need to send a signal back to the second card. This way you don't even need a second monitor.

    • @SonOfWalhall
      @SonOfWalhall 1 year ago

      So from GPU #1 into a capture card in the same PC?
      The setup is getting out of hand 😅

  • @gremlen
    @gremlen 3 years ago

    Great video, thanks! Would it be worth testing this concept using an eGPU with a 1650 Super? I'm guessing it would be basically the same. What do you think?

  • @TermsAndConditionss
    @TermsAndConditionss 4 years ago

    Your videos always remind me why I subbed. Great video. Thanks

  • @Tarulia
    @Tarulia 4 years ago +5

    I may have misunderstood you at 05:50, but PCIe 4 actually has no impact on this test at all. 8 lanes is 8 lanes. If the board is only wired for 8 lanes you don’t magically get PCIe 3.0 x16 speeds on those 4.0 x8 lanes. The theoretical bandwidth of the slot is doubled, but that also requires 4.0 on both ends. In other words: if you put a 3.0 card in the 4.0 x8 slot, you effectively get 3.0 x8 lanes and their bandwidth. Hence, no impact of PCIe 4.0 whatsoever in this test. Would be different if those boards were splitting PCIe 4.0 x16 into 2x PCIe 3.0 x16, which they are not (at least I know of none, but it would be cool).

    • @EposVox
      @EposVox  4 years ago +3

      Theoretically, possibly less bottlenecking of overall bandwidth to the CPU and back to the second GPU and to anything through the chipset. I didn't imply per-GPU bandwidth would change. But the memory channel/speed plus faster overall throughput makes a difference, even if minimal; we're already talking minimal differences at this point for this kind of video anyway

    • @p51mustang24
      @p51mustang24 4 years ago

      You still have to ask the question: do most GPUs use the full speed of x16, whether 3.0 OR 4.0? Most don't come close, AFAIK. Usually the new standards come out years in advance of GPUs saturating the available bandwidth, no?

    • @MFMArt
      @MFMArt 4 years ago

      p51mustang24, agreed. Gamers Nexus, I believe, ran a test on this; there is no performance degradation using a GPU in x8 mode.

  • @malcowicz
    @malcowicz 4 years ago

    Great research! Good job man!

  • @flixilux
    @flixilux 1 year ago +1

    This video is old and I just came across it, but I still want to share my thoughts.
    Everything you talked about only applies to people with Nvidia GPUs. They might be the majority, but it still makes you sound ignorant.
    Here's the problem: AMD, with its unified cores, does not have "ASICs" for encoding. The cores that render also do compute and encoding. This is terrible if you have a low-to-mid-tier AMD GPU.
    Case in point: I run an RX 5600 XT, which in rasterized performance is on par with or better than an RTX 2060. It runs even newer games just fine and dandy. AMD's "Adrenalin" driver software, the equivalent of Nvidia's GeForce Experience, has an "Instant Replay" function, functionally identical to ShadowPlay. Having this activated while running games at unlocked framerates not only hurts the performance of the game, it also makes the recordings turn out choppy and stuttery.
    Besides that, the decoder also seems to have its issues. Running YouTube videos at 1080p60 while playing a game makes it drop frames or become entirely unresponsive. Turn off hardware acceleration and you immediately see better framerates in games while having YouTube open in the background, and no more skipped frames.
    "But how can I stream or record my gameplay without suffering terrible lag and skipped frames?"
    Get a second GPU. Be it something like a 750 Ti if you only stream 1080p60, or something like a used Quadro or Tesla from eBay if you want something more fun to experiment with.
    Not only will these help you encode video (as in stream or record), they also help with stuff like hardware acceleration for most desktop UIs and for watching Twitch streams or YT vids, while your other GPU focuses on rendering the game. It also gives you access to Nvidia-specific features like CUDA, or tensor cores on newer cards. If you're a nerd and love messing around with UVR5 or image upscaling using waifu2x or Topaz AI, those things are more feasible with an Nvidia GPU.

    • @EposVox
      @EposVox  1 year ago

      AMD GPUs absolutely have dedicated hardware for encoding. You are incorrect.

    • @bobbybamf14
      @bobbybamf14 1 year ago

      @EposVox Isn't Nvidia's encoding better for streaming to Twitch?

  • @ArgadashHD
    @ArgadashHD 4 years ago

    Haven't watched the video yet and don't have the time right now, but have you tried this: disable the gaming GPU in Device Manager, then launch OBS and re-enable the gaming GPU. Shouldn't be a big performance hit, even though both GPUs will only run at 8 PCIe lanes instead of 16. Another option would be activating the integrated GPU on an Intel chip, setting OBS to start on it, and then using a capture card to capture the image. All in all still not optimal, but possible.

    • @EposVox
      @EposVox  4 years ago +3

      ...yes I know HOW to do it. The point of the video is that it hurts performance and backing it up with data.

  • @lyzrchllyhvd
    @lyzrchllyhvd 4 years ago +1

    How about all of us VFX artists with 4+ GPUs for rendering? One of the best parts of GPU rendering is that you can play a game on one GPU while you render on the others. Also, you never addressed using a secondary GPU just for the compositing. Using Windows Settings > Graphics options I configured OBS to run on one of my other GPUs, but still use NVENC on the one I game on.

    • @lyzrchllyhvd
      @lyzrchllyhvd 4 years ago

      Also, yes, new NVENC is on (I checked the log), and the host platform is Epyc Rome (overclocked with P-states), which has the PCIe lanes and memory bandwidth.

  • @Toastfalter
    @Toastfalter 4 years ago

    To me it sounds a bit strange.
    I use 6 monitors on my PC (somehow one always came along when upgrading).
    I have an RTX 2070 and a GTX 1060 installed.
    The lower 3 monitors are on the 2070 and the upper ones on the 1060. OBS runs on one of the upper monitors.
    I didn't need to change anything in OBS, and I didn't set up SLI there either. For desktop recordings I simply tell OBS the monitor number as Windows numbered it.
    I play on a lower monitor, so on the 2070, which OBS recognizes without problems in game capture.
    However, the log file says: Loading up D3D11 on adapter NVIDIA GeForce RTX 2070 (0).
    In the video, however, the 2nd graphics card is mentioned there.
    Since version 25 of OBS, only the new NVENC is available.
    Since OBS apparently uses the 2070, I think it works with the new NVENC.
    I have OBS running on a monitor that is connected to the 1060.
    It may be that I have lost FPS as a result, but I don't notice it, because everything is capped to 60 FPS anyway due to recording.
    In OBS I set up GPU 1 with new NVENC and CQP 18.
    The video is great, no question (like all your videos)!!!
    Somehow I did not have the problem that OBS ends up on the second graphics card.
    In the Task Manager performance window, I do see a difference on the 2nd graphics card between recording and not recording.
    Strange...
    (I used Google Translate from German to English, so the sentence structure may be confusing.)

  • @Chozo4
    @Chozo4 4 years ago

    So - I noticed this issue a while ago myself: a 10% loss in FPS with a second GPU. I **THINK** I've figured out the issue. It lies in something to do with the WDDM driver itself and multi-GPU setups. Adding a second GPU (a Quadro K620, for instance) made my GTX Titan X go from a Heaven benchmark score of ~2500 to ~2250, and it was hard to get it back up. Simply having drivers loaded for the second GPU caused the drop, whereas disabling the 2nd GPU oddly fixed the issue. After some research: provided it is used strictly for compute such as stream encoding, you can set the second GPU to "TCC mode" rather than the default "WDDM mode" through the NVSMI application included with the Nvidia drivers (link provided). After a simple reboot the GPU can only be used for those purposes, but it no longer incurs that inherent 10% performance penalty. My benchmark scores went right back to where they were before adding the second GPU, and 2nd-GPU encoding worked just as well as before - possibly slightly better due to less driver overhead and memory fragmentation inherent to WDDM mode. Could you check and validate my findings to see if you experience the same result in fixing the FPS issue?
    docs.nvidia.com/gameworks/content/developertools/desktop/nsight/tesla_compute_cluster.htm
    I would argue that I did have a benefit from using a second GPU even before. I always tax my primary GPU, and without a second GPU to offload to, my stream ended up very laggy and choppy when using NVENC on the same GPU; encoding on the second GPU made it smooth and lagless. And changing the second GPU to TCC mode ended up fixing the side effects of having the second GPU.
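    (For reference, a minimal sketch of that switch, assuming nvidia-smi is on PATH and the secondary card is GPU index 1. Note that the TCC driver model is only accepted on Quadro/Tesla-class cards like the K620 above; GeForce cards will generally refuse it:)

      # List GPUs with their indices to identify the encode card
      nvidia-smi -L
      # Switch GPU 1 to the TCC driver model (0 = WDDM, 1 = TCC); requires admin and a reboot
      nvidia-smi -i 1 -dm 1
      # Revert later with: nvidia-smi -i 1 -dm 0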

    • @DjScribblez
      @DjScribblez 2 years ago

      Any update on if this is still true today?

  • @tuobraun
    @tuobraun 4 years ago

    Amazing video. ❤️ Thank you for such a detailed test.

    • @EposVox
      @EposVox  4 years ago

      Thanks for watching!

  • @DFiantFigure
    @DFiantFigure 1 year ago

    The biggest thing that stood out to me was how much more power you have to draw from your PSU.

  • @Blahdramagaming
    @Blahdramagaming 4 years ago +1

    Damn that’s a real shame, would’ve loved to just throw in my 580 and use it to record. Glad I was informed though. Thank you.

    • @techgoggles
      @techgoggles 4 years ago

      AMD encoders are shit! I have a Vega 64 and the game looks shit!

    • @p51mustang24
      @p51mustang24 4 years ago

      @@techgoggles Maybe he's got an old GTX 580 lying around

    • @rockrl98
      @rockrl98 3 years ago

      @@p51mustang24 NVENC first appeared in Kepler GPUs...

  • @Anthony-qj3gd
    @Anthony-qj3gd 2 years ago

    @EposVox So does the FPS drop in general when you put a second GPU in, or only when you try to stream? Because I'm trying to do a dual-monitor setup, and my CPU has no integrated GPU and my GTX 1070 has only one HDMI port. I could get an adapter, but I was leaning towards a second GPU. Let me know

  • @shawns9070
    @shawns9070 4 years ago

    Hi EposVox, thanks for the video. I think this was very much worth making, but I have 2 points that I think are worth discussing.
    I have a gaming laptop with an Intel CPU, and so an iGPU, as well as an Nvidia 1060 Max-Q GPU. I have at times had issues streaming with this machine with some games. These laptops are designed so that the iGPU handles everyday tasks and the Nvidia GPU takes over when more demanding tasks happen (games etc.). With some games I have had huge problems streaming with this setup, as some games seem to still want to use the iGPU unless I force them to use the GPU, which does not always work. I have had times where I would also only get black screens, as you experienced too, no matter what capture type I used, with the exception that sometimes window capture would work. Interestingly, the XSplit options (I have tried them all) all have no issues with these games, and I found they mostly offer better FPS in-game too, for these games at least.
    It might be interesting to see if using an Intel or AMD iGPU/GPU with an Nvidia one would work any differently. That said, I am thinking it would probably end up with a lot more issues.
    I am now using a 2-computer setup using NDI and have had some encoding issues, missing or skipping frames on the sending computer. Do you know if running OBS as Administrator will help with that, and should I do it on both computers?

  • @philliplucia2523
    @philliplucia2523 1 year ago +1

    I don't know what he's talking about; when I hit record, my 3D GPU usage is at 50% and even jumps to 80%

  • @risingSilence
    @risingSilence 4 years ago

    @EposVox: great video as always. Is there a way to change the rescaling behaviour when using "Rescale Output"? It looks ugly, not as bad as nearest neighbor but close. There must be a way to utilize the custom x264 commands, but setting filters there obviously does not work. Why don't they include an option for that? I know it's not ideal and it's done by the CPU, but there must be an elegant way to record native 1440p and stream 720p without those nearest-neighbor-esque edges everywhere :\

    • @EposVox
      @EposVox  4 years ago

      They do include scaling filters. Those filters do work

    • @risingSilence
      @risingSilence 4 years ago

      @@EposVox vf scale? How do I format those?

    • @risingSilence
      @risingSilence 4 years ago

      sws_flags doesn't work for me no matter how I format it. sws_flags=bicubic or sws_flags="bicubic", nothing works :\ Do you know how to format it properly? Or is it another command?

    • @risingSilence
      @risingSilence 4 years ago

      obsproject.com/forum/threads/which-resolution-scaler-should-i-use.47317/#post-211823
      "One problem with FFmpeg and OBS-Studio (output setting), is OBS-Studio is hard coded to use a poor quality fast_bilinear with no option to change scaling to regular bilinear, bicubic or lanczos, for users of fast CPU and slow GPU."
      Is that still the case? There's gotta be a way to use something other than fast_bilinear...

    • @risingSilence
      @risingSilence 4 years ago

      @@EposVox OBS log states: "[x264 encoder: 'streaming_h264'] x264 param: sws_flags=bicubic failed"
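      (For comparison, outside of OBS these scaler flags are ordinary FFmpeg options. A minimal sketch, assuming ffmpeg is on PATH; the filenames are placeholders, and the video is re-encoded with the default H.264 settings:)

        # Downscale a 1440p recording to 720p using the lanczos scaler instead of the default bilinear
        ffmpeg -i recording_1440p.mp4 -vf "scale=1280:720:flags=lanczos" -c:a copy stream_720p.mp4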

  • @Rov-Nihil
    @Rov-Nihil 2 years ago

    I'm trying out having an AMD GPU as main (cuz AMF sucks) and an NV one as encoder, just to use NVENC on the same PC so I wouldn't need a second PC. Another thing I've noticed when using two GPUs is that the performance differs depending on which GPU you attach the screen to; it can go from drawing the load on one or the other, to drawing from both! So there is some level of asynchronous computing done, but I'm sad that we haven't reached the stage where games would benefit from both (I'M LOOKING AT YOU, DX12 aka Vulkan!)

  • @markmozza
    @markmozza 3 years ago

    I keep hearing that using NVENC has zero effect on GPU performance, or doesn't impact the GPU in any way since it's a separate chip. However, when I run my tests I always find my GPU usage jumps up 10-15% when using the encoder. It's been like this for as long as I can remember.

    • @EposVox
      @EposVox  3 years ago +1

      If you're just looking at the Task Manager summary view, that's because it combines 3D loads and encoding loads into one summary and isn't accurate. There are a bunch of different parts of the GPU to track performance and usage on, and the encoder is its own hardware that does not utilize the 3D hardware used for games.
      OBS itself does use the 3D hardware to composite and render the scene, however
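      (One way to see that split yourself, a minimal sketch assuming an Nvidia card with nvidia-smi on PATH: the "enc" column is the NVENC block, reported separately from "sm", the 3D/compute units:)

        # Print per-engine utilization once per second (sm / mem / enc / dec columns)
        nvidia-smi dmon -s u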

    • @markmozza
      @markmozza 3 years ago

      @@EposVox awesome, thanks for the reply. Makes perfect sense

  • @JickFincter
    @JickFincter 4 years ago

    I run a second lower-tier GPU just to run my extra monitors, since I don't have onboard graphics, and I noticed that using the second GPU yields much worse results than just using your primary GPU; that is the whole reason I looked this video up. My secondary GPU is literally only useful for running RTX Voice or helping train AI; I cannot find another use case.

  • @sku56
    @sku56 4 years ago +3

    rip, I guess I'll just hold on to my extra GTX 1070 ...

  • @michaelmedicworldoftanks33fps
    @michaelmedicworldoftanks33fps 3 years ago

    I have a question: my Intel laptop has an i5-7200U and a dedicated GeForce MX110, and games normally run on the MX110, but the screen and OBS normally run on the Intel HD graphics. Is this the same case as you are explaining in this video, and is it bad for recording video? Should I run OBS recording on the GeForce MX110 and use the SLI mode in OBS for recording? Would that give much better recordings in my case?

  • @kordakin
    @kordakin 3 years ago

    For me it works better with 2 GPUs; I guess it depends on the setup. BTW, there's a setting in OBS where you can select which GPU to use for streaming: GPU 0 for secondary, 1 for primary, depending on which one you want to use. For me it's a significant boost with 2 GPUs; some games I couldn't even stream with 1 GPU, and with 2 there's no problem

    • @EposVox
      @EposVox  3 years ago

      I talk about the setting in the video

    • @NuSpirit_
      @NuSpirit_ 3 years ago

      Yeah, the same thing is happening to me - some games just nail the 1070 I have and get quite a lot of frame drops/lag, and I'm thinking about buying a P400 or T400 (combined with a 2700X and a quad RAM setup). So I guess it's 50:50 whether it'll work or not.

  • @alisonmarshall2596
    @alisonmarshall2596 4 years ago

    I have to say I have used a 2nd GPU as an encoder, but with AMD cards and on older hardware, so my experiences differ from this video. The reason was that my FX 8350 died, forcing me back to my Phenom II 1100T, as at the time I couldn't afford the upgrade I wanted to make (a 2600, which would mean new RAM and a new mobo too). The 1100T could not cope with streaming or recording, and my RX 580 wasn't doing brilliantly either. I looked at my options and thought I'd try my old R7 360 to encode with, and it actually worked, giving me better quality recording and streaming than what I was getting; unfortunately my CPU was really starting to struggle with games. This meant I ended up overclocking (to squeeze some extra performance out), which then led to not having enough power to run everything, so I ended up using reduced encoder settings on just the RX 580.
    Obviously what I did was a worst-case scenario, done as a temporary measure, and maybe I got lucky with my results, or AMD works differently to Nvidia. So it would be nice if you were able to test older hardware, because with newer hardware I see no reason to do it at all; but as in my case, older hardware might benefit from it, or at least gain a temporary workaround for someone as unlucky as I was.
    Edit: I will also add that both graphics cards were running on PCIe 2.0 at x16.

  • @berryvr7184
    @berryvr7184 4 years ago +2

    So, what if you have two graphics cards and a streaming/recording capture card? Wouldn't that solve anything?

    • @EposVox
      @EposVox  4 years ago +3

      No? That would further split up your PCIe lanes etc. even more. I showed that you lose framerate just by having the second card in; a third card is only going to compound that issue

  • @MichaelW1980randoms
    @MichaelW1980randoms 4 years ago

    I have two more questions:
    1) Is the performance issue caused by the main graphics card always being the one used for rendering?
    2) Does setting a framerate limit in the driver help keep GPU usage in check, to keep rendering lag low?

  • @Phoenykz
    @Phoenykz 4 years ago +3

    This might sound dumb, but would it make any difference if you had the game screen output to a capture card on the same system and forced OBS to use the second GPU to encode? I know it's overkill, but would it make any difference compared to this setup?

  • @ZaneDragonBorn
    @ZaneDragonBorn 1 year ago

    My issue currently is that my Medal client keeps having to revert to CPU encoding because my GPU usage is up at 99%. If this encoding runs on separate hardware, why does it suffer when the rest of the GPU is in use? Confuses me

  • @HowdyFolksGaming
    @HowdyFolksGaming 2 years ago

    Might I ask how exactly you went about getting OBS to run exclusively on the second GPU? I have a dedicated capture PC, and I'm attempting to run two instances of OBS, each running on a separate card. However, even though I can make one render using the second card's NVENC encoder, it always hammers the first card with other compute tasks as well, to the point that it hits 100% and I lose frames on each card. Meanwhile, the second card is ONLY running NVENC. I cannot figure out a way around this. Running one of the OBS instances exclusively on the second card would help, I think.

  • @gregrichardson1262
    @gregrichardson1262 1 year ago

    So I do have an argument that is completely missed: yes, using SLI to encode using 2 monitors is going to ruin performance, which is why you run two screens from one card and have the other GPU as a standalone encoder. The 2nd GPU does not need a video feed out to a screen to function as an encoder

    • @EposVox
      @EposVox  1 year ago

      Second monitor being on one GPU or the other doesn't impact performance.

    • @gregrichardson1262
      @gregrichardson1262 1 year ago

      Wow, you stated as much in your video

  • @Optive
    @Optive 4 years ago

    I CAN'T WAIT till support for downscaling is built into the new encoder. Don't GPUs scale displays all the time? (If you want to record 1080p and stream 720p, you have to use the older encoder, which taxes the CPU a bit instead of not at all)

    • @mokad
      @mokad 4 years ago

      Really? So I've been using NVENC (old) all this time, since I downscale my stream in the encoder tab in OBS?

    • @Optive
      @Optive 4 years ago

      @@mokad yep :/

  • @FlammableAirFreshener
    @FlammableAirFreshener 4 years ago

    Thank you for doing this - I had looked into it with the power of Google and found a whole bunch of "no, don't do this", but all the information I found was between 5 and 7 years old. With no information on the newer GPU lineups, I had been wondering if it could be worth it in some use cases.
    With games that will eventually fall out of delay hell, like Cyberpunk, those of us on single-PC streaming setups (or at least just me) could be wondering what these games can run like while streaming. Personally, I ended up just upgrading to a 2070 Super and slapping it with my best wishes.
    I'm thinking there may just be a point where you need to decide where your stream is going, and if upgrading the single PC isn't worth it, just get some used hardware and slap something like a 1650 inside. That being said, in the world of 1080p it doesn't seem like people will be getting there anytime soon.
    From what I understand, and something mentioned at the end of the video, the whole reason it won't work is that there just aren't enough PCIe lanes to use. So maybe in the future motherboards will improve to a point where something like this is an option. But I also imagine that by the time that happens, even if we end up with huge leaps in motherboards (and I doubt we will at the consumer end any time soon), we will just find ourselves in the same situation, just with higher resolutions, or with something like ray tracing being our base norm.

  • @VikingDudee
    @VikingDudee 4 years ago

    I love OBS but hate it at the same time. Still, you'd be better off using OBS and NDI than a dual-GPU system. I mean, an older i5 2400 system with a 780 Ti for NVENC encoding worked great for me. I just still hate that OBS kills performance somewhat, and that you pretty much have to lock your FPS or OBS kinda doesn't like it, as things are fighting for GPU resources. I typically just use AMD's ReLive on my 5700 XT and just deal with the occasional weird or ugly picture.

  • @PixelShaderPl
    @PixelShaderPl 4 years ago +5

    Hello, it always seemed to me that streaming with a second GPU in OBS was as simple as changing, in the streaming tab, GPU 0 to GPU 1.
    I never thought about SLI functions, since we have two different graphics card models. Now I've noticed that loading the second graphics card also affects the GPU load of the first one. I did not know that NVENC (new) falls back to the old one; that would explain the transfer of data between the first and second GPU. I thought that using 2 GPUs would relieve GPU 1 and help reduce micro-stutters and the overloading of GPU 1 by the game. I only have a GTX 980 and a Quadro P400. My motherboard supports PCIe 3.0 x16 (8+8) or 3.0 x16 + 2.0 x4. I ran a Quadro P400 benchmark and the result in points was almost the same for PCIe 3.0 x16 and PCIe 2.0 x4; for a more powerful card it would probably matter.
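    (A quick way to check what link each card actually negotiated, a minimal sketch assuming the Nvidia driver's nvidia-smi is on PATH; note the link can downclock at idle, so query while the card is under load:)

      # Report each GPU's current PCIe generation and lane width
      nvidia-smi --query-gpu=name,pcie.link.gen.current,pcie.link.width.current --format=csv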
    I never thought about some SLI functions since we have two other graphics card models, now I noticed that loading the second graphics card also affects the GPU load of the first one, I did not know that NVENC (new) changes to the old one, that would explain the transfer of data between the first and second gpu. I thought that using 2 GPUs will relieve GPU 1 and help reduce micro cuts and overloading of 1 GPU with the game, I only have GTX 980 and Quadro P400. My motherboard supports PCI-E 3.0 x16 (8 + 8) or 3.0 x 16 + 2.0 x4. I performed the Quadro P400 benchmark and the result was almost the same in points for PCI-E 3.0 x16 and PCI-E 2.0 x 4, probably for a more powerful card it would matter.