Stable Diffusion: The Ultimate GPU Guide

  • Published: 19 Oct 2022
  • Many members of the Stable Diffusion community have questions about GPUs, questions like which is better, AMD vs Nvidia? How much RAM do I need to run Stable Diffusion? Should I rent a GPU or buy one?
    PeePa, CeaselessVibing and myself created a guide to answer some of these questions (accessible here: docs.google.com/document/d/1l...) and this video is mostly a walkthrough of that guide. Absolute gun work guys!
    Discord: / discord
    ------- Links -------
    No Man's Guide: docs.google.com/document/d/1l...
    ------- Music -------
    Music from freetousemusic.com
    ‘Snow’ by LuKremBo: • lukrembo - snow (royal...
    ‘Affogato’ by LuKremBo: • lukrembo - affogato (r...
    ‘Animal Friends’ by LuKremBo: • lukrembo - animal frie...
    ‘Biscuit’ by ‘LuKremBo’: • (no copyright music) l...
    ‘Branch’ by ‘LuKremBo’: • (no copyright music) c...
    ‘Butter’ by LuKremBo: • lukrembo - butter (roy...
    ‘Onion’ by LuKremBo: • (no copyright music) l...
    Many thanks to LuKremBo
    #stablediffusion #aiart #tutorials #techtutorials #gpu #nvidia
  • Science

Comments • 199

  • @pokepress
    @pokepress 1 year ago +21

    One other technical consideration-if you’re going to be batch processing files relatively quickly (mass upscaling, processing video frames as individual files, etc.), disk I/O speed can become a significant factor. As such, an SSD can be really helpful in those cases.
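The disk-I/O point above is easy to sanity-check yourself. A minimal stdlib-only sketch (the file count, sizes, and `.png` naming are arbitrary placeholders, and the OS page cache will flatter repeat runs):

```python
import os
import tempfile
import time

def time_bulk_read(n_files=200, size=256 * 1024):
    """Write n_files dummy 'frames' to a temp dir, then time reading them all back."""
    with tempfile.TemporaryDirectory() as d:
        payload = os.urandom(size)
        for i in range(n_files):
            with open(os.path.join(d, f"frame_{i:05d}.png"), "wb") as f:
                f.write(payload)
        start = time.perf_counter()
        total = 0
        for name in sorted(os.listdir(d)):  # mimics a sorted folder scan
            with open(os.path.join(d, name), "rb") as f:
                total += len(f.read())
        return total, time.perf_counter() - start

total, elapsed = time_bulk_read()
print(f"read {total / 2**20:.0f} MiB in {elapsed:.3f}s")
```

Run it once on an HDD path and once on an SSD path to see the gap the comment describes.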

    • @ThatGuyFromDetroit
      @ThatGuyFromDetroit 11 months ago

      yes, or even if you just switch checkpoint files often enough to notice the loading time, and you'll especially notice the benefit of an SSD once you've got a single folder trying to load a few thousand images all at once with 'sort by: date' enabled

    • @shagunkumar643
      @shagunkumar643 5 months ago +1

      Please let me know whether Stable Diffusion works on AMD graphics cards or not.

  • @wy4553
    @wy4553 1 year ago +3

    Wow, thank you so much for this detailed video!! This was super helpful. Really appreciate the Google doc guide too 🙏🙏

  • @Syncopator
    @Syncopator 6 months ago +3

    What nobody seems to talk about when considering one of these relatively new GPUs, is what MOTHERBOARD and CASE features do you need to plug them in? Slot size? Slot technology? Power requirements? Fans? There are tons of videos about choosing a GPU but you need to know that it'll work when you plug it into your desktop server or game system or whatever you're running.

  • @AlexUnder_BR
    @AlexUnder_BR 9 months ago +1

    You deserve to be bigger here, buddy... very explanatory, and a very concise and beginner-friendly way to explain complex stuff.
    Success to ya!

  • @temporallabsol9531
    @temporallabsol9531 1 year ago +8

    1. Fully just use colab if you want to play around. Don't expect amazing stuff.
    2. Pay someone if you just want a quick and experienced fullbuild model and don't want to learn how to run on a local or cloud system.
    3. Go full 🤓 - I suggest this one.

    • @lewingtonn
      @lewingtonn  1 year ago +1

      yeah that's very true, there's probably someone's nephew you can go to for an easy install

  • @VimCommando
    @VimCommando 1 year ago +30

    Many people overlook the RTX A4000. It is a professional card and its gaming performance is roughly an RTX 3070's. The A4000 has lower clock speeds with higher power efficiency, more CUDA cores and 16 GB VRAM. The low power usage combined with the 16 GB buffer makes it a very attractive option for running Stable Diffusion locally.
    Plus, thanks to Ethereum miners dumping these cards, they only run about $100 more than the RTX 3070, which only has 8 GB VRAM.

    • @VimCommando
      @VimCommando 10 months ago +1

      @@thesomewhatfantasticmrfox yes it does

    • @mycelia_ow
      @mycelia_ow 6 months ago

      I'd rather get a 3090 since I'd need the full 24GB

  • @laslo67
    @laslo67 1 year ago

    Many thanks for this. It answered many questions I had.

  • @PianoCinematix
    @PianoCinematix 1 year ago +40

    I upgraded from my 3080 Ti to the 4090 for 3D rendering, and the power increase is quite good, especially when training with DreamBooth (using xformers with Windows WSL). On the 3080 Ti it would take around 10-11 mins for 1000 steps. The 4090 can do 1500 steps in 10 mins, so that's a good 50 percent increase using the same settings. Hope this will help somebody if they're thinking about upgrading to a 4090 and the time frame of training. The one thing I will say is that at "512x512" image generation (using automatic1111) the 4090 and the 3080 Ti took exactly the same time to generate. The difference only becomes apparent when I increase the resolution, for example to 1024x726.
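For reference, the speed-up reported in this comment checks out with a little arithmetic (taking 10.5 min as the midpoint of "10-11 mins"):

```python
def steps_per_min(steps, minutes):
    """Training throughput in steps per minute."""
    return steps / minutes

# Figures quoted in the comment above:
old = steps_per_min(1000, 10.5)  # 3080 Ti: ~95 steps/min
new = steps_per_min(1500, 10)    # 4090: 150 steps/min
speedup = new / old - 1
print(f"~{speedup:.0%} faster")  # in the 50-60% range, matching the comment
```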

    • @lewingtonn
      @lewingtonn  1 year ago +6

      that's kind of weird and disappointing... the 4090 is supposed to be a BEAST

    • @PianoCinematix
      @PianoCinematix 1 year ago +1

      ​@@lewingtonn exactly, I think the biggest misconception is that most people assume the 2-4x rasterized performance, which isn't possible. You can, however, get that 2-4x performance with DLSS 3.0 frame generation. From a few big tech names online - one, I believe, was Gamers Nexus - "the maximum increase you will get over the last-gen 3090 Ti is up to a 70 percent increase in rasterizing performance on a 4090". At least that's what they found in their testing. It kind of correlates to the performance numbers I posted when training with DB. I personally don't mind though; the fact that the 3080 Ti could render those "512x512" images in less than 3 seconds is incredible. If there is a difference between the two, it's such a small time frame I couldn't tell the difference.

    • @lewingtonn
      @lewingtonn  1 year ago +2

      yeah, kind of mad how fast generation is. If you haven't already installed xformers, I would thoroughly recommend doing so. It sped my GPU up by 1.5x at least.

    • @ceaselessvibing5997
      @ceaselessvibing5997 1 year ago +2

      Run with --no-half
      4090 not optimised for 32bit float yet

    • @PianoCinematix
      @PianoCinematix 1 year ago

      @@ceaselessvibing5997 that would make soo much sense! thank you

  • @Gekko008
    @Gekko008 1 year ago +2

    7 months in, with the new 40-series released, your recommendation is still valid, I think. I've been looking closely at the GPU market as well, and with how Nvidia has been pricing their new cards, the performance-to-cost doesn't make as much sense.
    Two things that I think are worth adding to what you said:
    1) I didn't know much about Python or Pods at the time I started, so I locally installed Automatic1111 on my laptop. It was hard enough trying to learn everything about SD that I didn't want to also learn Python for it. One of the cons I think it's fair for you to mention is "for some people, Python can be a learning curve".
    2) I really liked your way of saying "no matter what the speed of your GPU, you kind of have to walk away from it". That is, I think there's a strong idea of "I want it now", but even with my laptop, I just set it and walk away (cook, play other games on my tablet, etc.). It reminds me of my days of shooting photos on film, when you couldn't see your actual work until development was over. I somehow feel like people aren't used to this idea anymore.
    Final note: I am so happy to hear that my overall analysis of the GPU lined up with your recommendation. The VRAM availability, cost etc. were the reasons I was looking at it, and it seems you came to exactly the same conclusion.

  • @mamamurphy3860
    @mamamurphy3860 1 year ago +2

    Regarding that last thing you said: I don't really want to generate grand images; I sort of want to use SD to come up with ideas for my own paintings, or img2img-fill my doodles to document ideas. So I'd rather generate a bunch of 512x512 images in bulk with like 30 samples. Would you still recommend the 3060 over the 3070 for something like that?

    • @lewingtonn
      @lewingtonn  1 year ago +3

      If cost is no object, definitely go for the 3070, since it WILL be faster, but it's really a cost-vs-speed tradeoff.

  • @AmirZaimMohdZaini
    @AmirZaimMohdZaini 1 year ago +2

    Mining GPU cards can also be used for Stable Diffusion. However, they only work well on Linux, which has the latest updated drivers that allow mining GPUs to be used with Stable Diffusion and any other non-CUDA apps. But the PCIe link being only 1.1 x1 limits the hardware's capabilities for other purposes.

  • @macguy9152
    @macguy9152 1 year ago +2

    Thanks so much for an awesome video! After watching it I've been playing around with Colab for a few days. I understand the complete cloud model, and I understand the local model. In your video you seem to indicate a hybrid model for those who don't have modern GPUs (I'm a Mac user and have a Windows VM).
    Any tips on how to set up such an environment? I.e., what software do I need to install on the VM, and where do I configure it to use the rented GPU?

  • @wagmi614
    @wagmi614 1 year ago

    Do you know if there's some way to use 2 GPUs on a single image-generation job to make it much faster?

  • @markdavidalcampado7784
    @markdavidalcampado7784 1 year ago

    Have you tried at least 2 GPUs in one PC, for better rendering in DreamBooth?

  • @60tiantian
    @60tiantian 1 year ago

    That was good advice, thanks bro

  • @AI.ImaGen
    @AI.ImaGen 1 year ago

    Yep! The MSI RTX 3060 Ventus 2X 12G OC is a perfect value/performance pick. You need VRAM to store textures; I can render full-HD images directly without the Hi-res option. Of course, it's not the right way to use S.D (AI grid: 512x512 for S.D 1.5 - 768x768 for S.D 2.1.768), but it works well and relatively fast for some landscape scenes. And I use it with a very old HP Z600... from 2009. The GPU power feed is specced for 75 W max, but in reality it's 200 W; you just need to buy a 6-to-8-pin adapter. The RTX 3060 needs 170 W (167 W max in HWiNFO64).

  • @Comic_Book_Creator
    @Comic_Book_Creator 1 year ago

    Can you tell me, please: if I buy Google Colab, can I then save my files in my notebook? Does it also give me storage space?

  • @ZeroIQ2
    @ZeroIQ2 1 year ago +4

    I have an RTX 3060 and, for reasons I will go into, Stable Diffusion was really slow for me... but only because I messed something up lol.
    I just assumed Stable Diffusion was always slow (like 10 minutes to render an image), so I thought, well, it's early days, things will get faster in the future. Then I read a post about checking whether PyTorch is actually set up correctly; the following command checks if PyTorch can see CUDA:
    python -c "import torch;print(torch.cuda.is_available())"
    I ran that and it returned "False", which was both really good and bad at the same time: good because it meant Stable Diffusion was only slow for me because it was using my CPU and not my GPU, but bad because now I had to work out how to fix that.
    So, to help anybody else that has the same issue, I did the following to fix it:
    1) I had to uninstall PyTorch, because I was using the wrong version. To do that I ran:
    pip uninstall torch
    2) I also had the wrong version of CUDA installed. My GPU (RTX 3060) is the Ampere generation, and I needed to install a version of CUDA that works with that, which for me was version 11.3:
    developer.nvidia.com/cuda-11.3.0-download-archive
    3) I then had to reinstall PyTorch, but a build that uses CUDA 11.3; the command for that is as follows:
    pip3 install torch torchvision torchaudio --extra-index-url download.pytorch.org/whl/cu113
    That fixed the issue for me; when I ran python -c "import torch;print(torch.cuda.is_available())" I now get "True", meaning it uses the GPU.
    You can also run the following to see what version of PyTorch is currently installed:
    python -c "import torch;print(torch.__version__)"
    Hope that helps somebody 🙂
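The checks in this comment can be wrapped in one small probe that won't crash whether or not PyTorch is installed (the function name is my own):

```python
def cuda_status():
    """Report the installed torch version and CUDA visibility without crashing."""
    try:
        import torch  # PyTorch may not be installed at all
    except ImportError:
        return {"torch": None, "cuda": False}
    return {"torch": torch.__version__, "cuda": torch.cuda.is_available()}

print(cuda_status())  # e.g. {'torch': '1.12.1+cu113', 'cuda': True} on a working setup
```

If `cuda` comes back `False` on a machine with an Nvidia card, the CPU-only wheel is the likely culprit, as the comment describes.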

    • @spencer5028
      @spencer5028 1 year ago +1

      I'd think the rtx 3060 12gb is the best bang for the buck

  • @android69_
    @android69_ 1 year ago +2

    Is this advice still applicable in May 2023? I want to run Stable Diffusion to generate accurate facial renders of humans.

  • @chrislloyd1734
    @chrislloyd1734 1 year ago +13

    I just could not be bothered spending 20 mins reinstalling the software each time I needed to rent a GPU or use Colab. So I decided to buy a GPU, and the one I chose was what you recommended: the 3060. The 12 GB of VRAM was the clincher, not to mention the price. It works great and is not that slow; even when training I get around 7000 steps per hour.

    • @lewingtonn
      @lewingtonn  1 year ago +4

      dude that's actually so nice to hear, thanks for letting me know!!!

    • @ogonbio8145
      @ogonbio8145 1 year ago +1

      I’m looking at the exact same gpu. I’m just wondering what you paid because I’m seeing them on Amazon right now for upwards of 500 dollars which seems like a lot.

    • @lewingtonn
      @lewingtonn  1 year ago +4

      @@ogonbio8145 500 is hella reasonable imo, I haven't bought one because my old 8gb one is still ok

    • @chrislloyd1734
      @chrislloyd1734 1 year ago +2

      @@ogonbio8145 I am in the Philippines. I bought a used one from Korea. It was from an Internet café. It cost me $275 and was like new, works great. I had been looking for a while on "facebook market" and this was a good deal. Good deals are around, but you have to be patient and snap them up as soon as you see them.

    • @Chris-xo2rq
      @Chris-xo2rq 1 year ago

      @@ogonbio8145 Wait until the 4080 and 4060 come out and compete with them; you should see the price drop.

  • @nobafan7515
    @nobafan7515 11 months ago

    Hey, can I run Stable Diffusion if I have 2 x 4GB GPUs (assuming they are both compatible with Stable Diffusion), or will it not run unless one of the GPUs can hold the entire model?

  • @u.h.d.4580
    @u.h.d.4580 1 year ago

    This is why Render is an awesome project.
    I'd be down to deploy my idle GPU to the network.

  • @sakuraistrash2much
    @sakuraistrash2much 9 months ago

    Does Stable Diffusion need more CUDA cores or more VRAM? Is an RTX 3060 enough to run Stable Diffusion without lagging, hanging or stuttering?

  • @Pauluz_The_Web_Gnome
    @Pauluz_The_Web_Gnome 1 year ago +3

    What is that high-pitched noise in your video? Your washing machine? And really, thanks for this info. I had ordered a 3060 (upgrading from a GTX 1060!!! Good card, served me well, but it was about time after six years of use) before I saw your video, so I am really happy.

    • @lewingtonn
      @lewingtonn  1 year ago +1

      we have really big mosquitos in australia... thanks for letting me know

    • @Pauluz_The_Web_Gnome
      @Pauluz_The_Web_Gnome 1 year ago

      @@lewingtonn hahahaha

  • @GlebLebedevOnSoftware
    @GlebLebedevOnSoftware 1 year ago +1

    Thanks for the video. Do you have any measurements of how much time it takes to generate 16 images in a batch with a certain number of steps? I'm currently running SD on a 1070 and it takes around 13 seconds to do 20 steps, so 16 images with 20 steps take three and a half minutes to generate. The same thing on a 2080 was taking only 3 seconds.

    • @Chris-xo2rq
      @Chris-xo2rq 1 year ago

      I can tell you my 2070 does about 6.2 iterations per second at 512x640, which would be 3.2 seconds for 20 steps, and 51 seconds for 16 of those images.

    • @GlebLebedevOnSoftware
      @GlebLebedevOnSoftware 1 year ago

      @@Chris-xo2rq A 3090 does around 15-18 iterations per second. Is it worth the extra money? I'm not so sure.
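All the timings traded in this thread reduce to the same formula: time = images × steps ÷ (iterations per second). A tiny helper (my own naming) reproduces the numbers above:

```python
def batch_time_s(images, steps, it_per_s):
    """Seconds to generate `images` images of `steps` steps each, one after another."""
    return images * steps / it_per_s

# Numbers from the thread above:
print(batch_time_s(16, 20, 6.2))        # RTX 2070 at 6.2 it/s: ~52 s for 16 images
print(batch_time_s(16, 20, 1.35) / 60)  # GTX 1070 at 1.35 it/s: ~4 minutes
```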

  • @LucasPereiradaSilva
    @LucasPereiradaSilva 9 months ago +3

    The 3060 12GB is a perfect pick for the price. I've been running the AUTOMATIC1111 webui and I get nearly 8 it/s when rendering with Euler at the default 512px. It can handle SDXL as well, but at 1200px it is kinda tough.

    • @FielValeryRTS
      @FielValeryRTS 7 months ago

      I bought 3060 12gb too a month ago. Still haven't tested Stable Diffusion on it, hopefully it'll go smoothly.

  • @MrVitaliyAT
    @MrVitaliyAT 1 year ago

    Is it possible to use multiple GPUs, and how? Can SD sum their VRAM for the work?

  • @PsycosisIncarnated
    @PsycosisIncarnated 1 year ago

    I'm aiming to produce detailed high-resolution works for album covers. Imagine hellish landscapes and large panoramic vistas. I'm using Stable Diffusion UI v2 with my Ryzen 5 5600X CPU right now; it's alright but very slow lmao. It usually takes like 30 mins to generate stuff the way I want it. I have an RX 6700 XT, but sadly only now have I found out that this GPU doesn't work with AI art (just as I'm getting into it).
    What do you recommend I buy for my art style? Should I rent a GPU on CoreWeave or something and only use it when I need to generate the images? Do the rented GPUs generate images fast?

    • @user-wvkd
      @user-wvkd 1 year ago

      legit sold my 2070 super for a 6800xt and now i want to trade back haha

  • @KeinNiemand
    @KeinNiemand 1 year ago

    As someone who primarily uses my GPU for gaming and only messes around with Stable Diffusion for fun, actual speed matters a lot more than just VRAM.

  • @BoxOfChocs
    @BoxOfChocs 1 year ago +1

    Cheers mate - you sound like a Kiwi. Just getting into this and finding that my RTX 2060 6GB is a bit limiting. Was leaning towards the RTX 3060 12GB from PB Tech - which I think I will continue with - but you've opened my eyes to the possibility of things like Open Pod - hope you have an affiliate link!

    • @_loss_
      @_loss_ 1 year ago

      The 2060 is GTX

  • @SeanRynearson
    @SeanRynearson 1 year ago +3

    I was just wondering if someone had any info on this. Thanks. I was looking for more in-depth power and frequency settings for efficiency. Speed does matter when doing large-batch, multi-parameter prompt computing :-)

    • @macguy9152
      @macguy9152 1 year ago

      I was wondering what speeds you're getting generating images. I'm renting and only getting

  • @ilmondodifaxe8216
    @ilmondodifaxe8216 1 year ago +3

    You can't make a 20 minute video without chapters! Come on!

  • @maxstepaniuk4355
    @maxstepaniuk4355 1 year ago +4

    I would add OS to your table. Automatic1111 works with AMD GPUs on Linux only. On Windows, at least with my 5700, this GUI doesn't work; it still falls back to the CPU.

  • @jobney
    @jobney 1 year ago +1

    I'm still running a 980 Ti. I'm having issues with "interrogate CLIP": I get the primary prompt, but it throws an error before it gets to the background descriptions.

    • @lewingtonn
      @lewingtonn  1 year ago

      You're using a very old gpu there... honestly well done for even getting to this point

    • @jobney
      @jobney 1 year ago +1

      @@lewingtonn only 6gb VRAM, but overall it doesn't feel too slow.

  • @screamtheteam12345
    @screamtheteam12345 1 year ago

    Hi! Thank you so much for the useful tutorial! I followed your advice, rented a GPU on RunPod and got Stable Diffusion running. However, I keep getting the issue "RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument index in method wrapper__index_select)". I saw some people say that medvram needed to be turned off to fix this, but I wasn't sure how to do that seeing as I am renting the device (I'm a total beginner haha). I was wondering if anyone has had this same issue and knows a possible solution? Thank you!

  • @dezoksiribonukleotid5011
    @dezoksiribonukleotid5011 1 year ago

    I have an NVIDIA GeForce GTX 1050 Ti, but while Stable Diffusion is running it works at 1%. How can this be fixed, or is it normal?

  • @macguy9152
    @macguy9152 1 year ago +3

    Great guide, followed your advice and started renting but getting super slow speeds generating images on google colab (

    • @lewingtonn
      @lewingtonn  1 year ago

      that's odd... I find colab quite fast

  • @policani
    @policani 1 year ago

    Running an AMD GPU, I wasted 2 days trying to get Automatic1111 to run. Eventually I switched to Shark after being informed that Automatic wasn't compatible with AMD... It's a shame, because I really wanted to try creating videos using Stable Diffusion.

  • @TheIbdeathskull
    @TheIbdeathskull 1 year ago +4

    Are there gpu laptops that are optimal for stable diffusion?

    • @mumbai2istanbul
      @mumbai2istanbul 10 months ago

      Omen, MSI Cyborg (better to have 16GB RAM and a minimum of 6GB of Nvidia graphics)

  • @RealShinpin
    @RealShinpin 1 year ago +1

    Do you think a 3080 12GB will be capable of effectively utilizing SD? In a fast and effective manner, I mean.
    I wanted something bigger for gaming anyway.

  • @Sena-oo5hj
    @Sena-oo5hj 1 year ago

    How do I run it in SageMaker Studio Lab?
    Already tried it but no luck :')
    I'm a noob.

  • @Chrisspru
    @Chrisspru 1 year ago +1

    I use an RTX 2060 12GB. It's a 2060 Super chip on a 3060 PCB. At stock settings it performs worse than a 2060 Super, but it has very good OC capabilities and thereby outpaces the 2060 Super. I run mine at 3060 clocks (+100 MHz core and +500 MHz memory frequency in MSI Afterburner), at which point it behaves like a 2060 Super / 3060 hybrid.
    I got mine for 299€.

  • @ToweringTimoth
    @ToweringTimoth 10 months ago

    Privacy is nice, so it's a local GPU for me.
    Also, I found this vid at the end of two days of scavenging research and was pleasantly surprised we had come to the same conclusion hah

  • @Remowylliams
    @Remowylliams 1 year ago +1

    Nice info vid, thanks and just a small comment: Using SD once a day. *chortle* For me, SD is like opening a 6' tall can of pringles. one chip leads to another and in a blink the damn can is empty and you're jonesing to get to the quick-e-mart to buy a case. But that's just me. I may have to go to AA for SD soon. Cheers

    • @lewingtonn
      @lewingtonn  1 year ago +1

      that's a really apt analogy, I'm so far down the can that I made a youtube channel haha. I should probably take a break too...

  • @temporallabsol9531
    @temporallabsol9531 1 year ago +3

    I use coreweave but I'm hitting industrial solutions too.
    A6000 is da way.

    • @lewingtonn
      @lewingtonn  1 year ago +1

      why is that better than the 3090? Surely you don't need 48 GB for anything legal!

  • @enlightenedeyemusic
    @enlightenedeyemusic 1 year ago

    If I have 16 GB of RAM and 8 GB of GPU memory (VRAM), will it work?

  • @florentinhonorius613
    @florentinhonorius613 1 year ago

    I have an M1 Pro 16-core-GPU MacBook. Will it be OK?

  • @Kevin-jb2pv
    @Kevin-jb2pv 1 year ago +5

    You skip the biggest benefit of buying a GPU vs. renting one! _You can game on it when you're not using Stable Diffusion!_ Also, if someone's not into gaming but is seriously interested in Stable Diffusion to the point of considering investing in a new GPU, then I can pretty much guarantee they would also be interested in all the things a good graphics card lets them do in Blender, Photoshop/GIMP, Premiere/After Effects/Vegas/whatever video editor, Unreal, Unity, CAD, and all kinds of other stuff that you just can't do very well (or at all) with integrated graphics alone or on an old, outdated graphics card.
    Some of these people might have tried running these programs on their current computers only to get discouraged because they ran like trash, leaving them feeling like the software itself was bad or that they were just bad at using it, when the problem is very likely that they were trying to run resource-hungry programs at or below minimum spec.
    P.S., if that sounds like you but a new PC is out of the question right now, try hunting down older versions of whatever software you're trying to use. Older versions will have fewer features, but will run better on limited hardware. Also, you won't be able to _buy_ these old versions, so be safe ;)

  • @truesightgrabber
    @truesightgrabber 1 year ago

    What about SLI for Nvidia? Is it possible, and is it worth it?

  • @kluih9872
    @kluih9872 1 year ago

    Ty

  • @PsycosisIncarnated
    @PsycosisIncarnated 1 year ago +1

    What resolutions can an RTX 3060 manage? I wanna make images with Stable Diffusion at around 1280x960 with 100+ steps. These images take me like 4 hours with just my Ryzen 5 5600X CPU.
    Would the RTX be able to handle such images, and make them even faster? Would appreciate it if someone could help me out.

    • @taurec1
      @taurec1 1 year ago +1

      Hello, my RTX 3060 12 GB Eagle OC needs 2 min 54 sec for 1280x960 with 100 steps on a Ryzen 5959x. But it reaches the limit of the VRAM.

    • @taurec1
      @taurec1 1 year ago

      Sorry Ryzen 5950x 🙂

  • @Ranguvar13
    @Ranguvar13 1 year ago +3

    May want to add privacy to the reasons not to rent :)

  • @amanmadaan87
    @amanmadaan87 1 year ago +1

    Hi all,
    I have an RTX 3050 and 8GB of RAM and am still struggling to get the most out of Stable Diffusion. Any kind of help/suggestions would be appreciated.
    Thanks!
    Thanks!

  • @exomata2134
    @exomata2134 1 year ago

    Is the 4070 Ti a good pick for Stable Diffusion? The thing is, the 3090 is very tempting with so much VRAM, but it's even more expensive while being older and used.

    • @leucome
      @leucome 1 year ago

      12GB of VRAM annoys me because there are a lot of features that won't run at all with only 12GB. It works, yeah, but it's the option for people who already have a GPU... If you're buying with the intention of using Stable Diffusion, then getting a used 3090 or a 7900 XTX makes more sense, I think.

    • @stevenp6761
      @stevenp6761 10 months ago

      @@leucome what features won't run on 12gb? I am hesitating between the 4070 Ti and the 4080, which has 16gb of VRAM.

  • @PepperDoom
    @PepperDoom 1 year ago

    Would a 12 gb RTX 3080 GPU be a good choice?

  • @frankschannel2642
    @frankschannel2642 1 year ago

    I have a mining rig with seven GPUs (all 2080 or better). Can Stable Diffusion utilize the VRAM across more than one GPU? My other option is to sell multiple GPUs and invest the $$$ in one of the newer GPUs. Can someone answer whether SD (like Mandelbulber) can recognize and use ALL installed GPUs???
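As replies elsewhere in this thread note, Stable Diffusion can't pool VRAM across cards for a single image, but a batch of prompts can be parallelized by pinning one worker process per GPU via the standard CUDA_VISIBLE_DEVICES environment variable. A sketch under that assumption (the `launch.py` script name and `--prompts` flag are hypothetical placeholders, not a real webui interface):

```python
import os

def plan_workers(prompts, n_gpus, script="launch.py"):
    """Give each GPU a round-robin share of `prompts` and build the command
    and environment for one worker process pinned to that GPU."""
    plan = []
    for gpu in range(n_gpus):
        chunk = prompts[gpu::n_gpus]  # this GPU's share of the batch
        if not chunk:
            continue  # more GPUs than prompts
        env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu))  # worker sees one GPU
        cmd = ["python", script, "--prompts", ";".join(chunk)]
        plan.append((cmd, env))
    return plan

for cmd, env in plan_workers(["castle", "forest", "harbour"], n_gpus=2):
    print(env["CUDA_VISIBLE_DEVICES"], cmd)
```

Each planned command could then be started with `subprocess.Popen(cmd, env=env)`; every worker sees exactly one card as `cuda:0`, so no framework-level multi-GPU support is needed.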

  • @SnoochyB
    @SnoochyB 1 year ago

    I bought the 3060 Ti, which only comes with 8gb of VRAM! I could have had the 3060 with its 12gb, and I didn't know! I thought the Ti was always better. I am error.

  • @garen591
    @garen591 1 year ago

    Another question: do you need an Intel CPU to run it?

    • @AmirZaimMohdZaini
      @AmirZaimMohdZaini 1 year ago

      Either AMD or Intel; the CPU must be at least quad-core and 3.0GHz or above to run SD.
      Also, the CPU should have at least the SSE4.1 feature.

  • @Synthlock
    @Synthlock 1 year ago

    I have an RX 6900 XT and it is way slower than my first-gen M1. I don't even see the use of a 12GB GPU in Stable Diffusion anymore. I'm up at the highest resolutions to provide professional images. Am I missing something here? How it is possible to work with the 8GB of RAM, I have no idea.

    • @empty19941
      @empty19941 1 year ago

      You're running it on the CPU; AMD GPUs aren't fully supported because they don't have CUDA cores.

  • @ArtiIntel-wl7su
    @ArtiIntel-wl7su 6 months ago

    I'm still getting plenty of value out of my GTX 1070 8GB, but then again, I'm not training models.

  • @humdelala
    @humdelala 10 months ago

    Sorry this is a bit long, but would this be suitable?
    Screen size 15.6in
    Resolution 1920 x 1080
    Refresh Rate 120Hz
    Battery
    Battery Run Time 7hours
    Processor
    Manufacturer Intel
    Series Intel Core i5
    Number 12450H
    Generation 12th Gen
    Cores 8
    Clock Speed 2.0GHz
    Memory (RAM)
    Installed Size 16GB
    Type DDR4
    Total Slots 2
    Spare Slots 0
    Maximum Capacity 16GB
    Storage
    SSD Capacity 512GB
    Graphics
    Graphics Type Dedicated
    Graphics Manufacturer NVIDIA
    Graphics card GeForce RTX 3060
    Integrated Graphics Intel Iris Xe Graphics
    Video Memory 6GB
    TGP - Total Graphics Power 105W
    Ports & Connectivity
    USB 3.0 2
    USB-C 1
    HDMI 1
    Bluetooth Wi-Fi 6, 11ax 2x2?

  • @ToniCorvera
    @ToniCorvera 1 year ago

    When talking about the price advantage of renting, you totally overlooked the price of storage, which can noticeably mark up the total given the sizes of models and such (and also makes the cost calculations much less straightforward).
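This storage point is easy to fold into the rent-vs-buy math. A hedged cost model (the rates below are made-up placeholders, not real provider prices):

```python
def monthly_rent_cost(gpu_hours, gpu_rate_per_h, storage_gb, storage_rate_per_gb):
    """Monthly rental bill: metered compute time plus persistent model storage."""
    return gpu_hours * gpu_rate_per_h + storage_gb * storage_rate_per_gb

# e.g. 20 h of compute at $0.50/h plus 100 GB of checkpoints at $0.10/GB-month:
print(monthly_rent_cost(20, 0.50, 100, 0.10))  # 20.0 -- storage is half the bill
```

With multi-gigabyte checkpoints, the storage term can rival the compute term, which is exactly the comment's complaint.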

  • @bitelaserkhalif
    @bitelaserkhalif 1 year ago +4

    I use an RTX 3060 12gb, because VRAM size is important for AI/ML things. If the VRAM is too small, it can't run.
    The plus side is that gaming performance is okay.

    • @x1c3x
      @x1c3x 1 year ago

      How fast can you generate 512x512 or 768, and with how many steps?
      Could you please share some numbers if you find the time :)
      My 1070 does:
      Prompt: Summer sunset
      Steps: 20, Sampler: Euler a, CFG scale: 7, Seed: 67310028, Size: 512x512, Model: Default_model_1.4
      Time taken: 15.45s at 1.35 it/s
      With everything the same, just at 768x768, it takes 35s.
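Benchmark reports like the one above follow AUTOMATIC1111's one-line "Steps: 20, Sampler: Euler a, …" parameter format, which is easy to parse when comparing cards (a minimal sketch; it skips the prompt line and does no validation):

```python
def parse_params(line):
    """Parse an A1111-style 'Key: value, Key: value' parameter line into a dict."""
    out = {}
    for part in line.split(","):
        key, _, value = part.partition(":")
        if value:
            out[key.strip()] = value.strip()
    return out

params = parse_params(
    "Steps: 20, Sampler: Euler a, CFG scale: 7, Seed: 67310028, Size: 512x512"
)
print(params["Sampler"])  # Euler a
```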

    • @bitelaserkhalif
      @bitelaserkhalif 1 year ago

      @@x1c3x dunno how to benchmark them

    • @MartinDimitrov0
      @MartinDimitrov0 1 year ago

      @@bitelaserkhalif Just do any 512x512 prompt with 28 steps and check in the cmd dialog how many seconds it took. On 1070 with xformers I get a 512x512 image in 15 secs, 25 secs for 512x768/768x512 image. So I'm wondering how much difference does the extra power and vram make.

    • @KillFrenzy96
      @KillFrenzy96 1 year ago +1

      If you are still wondering, my RTX 3060 12GB can generate an image with those settings in less than 3 seconds at 7it/s. My RTX 3080 10GB gets around 14it/s with those settings. xformers was used for both tests. At 768x768 resolution, they generate at slightly less than half the speed.

    • @bitelaserkhalif
      @bitelaserkhalif 1 year ago

      @@KillFrenzy96 mine is around 3 to 5 it/s (512x512, 20 steps, no xformers), because of undervolting; full load when gaming is 60 to 80-ish watts

  • @MortezaYousefi
    @MortezaYousefi 1 year ago +1

    can we use 2 VGA for example 2 3060 ? is there any way use 2 VGA ?

    • @lewingtonn
      @lewingtonn  1 year ago

      I'm not 100% sure what you mean. According to quora "VGA stands for Video Graphics Array, and is one of the many interfaces that you can find that a GPU utilizes. It’s an interface for displaying pictures on a monitor." What kind of GPU do you have?

    • @MortezaYousefi
      @MortezaYousefi 1 year ago +1

      @@lewingtonn I have two 1070s now, and as I am a 3D artist I can use the processing power of both in Blender.
      Blender is also written in Python, I think, so maybe something similar could be done with SD.
      Because memory is so important for SD, buying one Nvidia card with lots of memory is, I think, more expensive than buying 2 or more cards, like ETH mining rigs that used many VGA cards.

    • @lewingtonn
      @lewingtonn  1 year ago

      @@MortezaYousefi sadly this won't work :( using multiple GPUs at the same time is very hard to do, and the AUTOMATIC1111 people haven't gotten around to it yet. You will basically need an NVIDIA card to use anything new.

    • @MortezaYousefi
      @MortezaYousefi 1 year ago +1

      @@lewingtonn thank you for your answer

  • @tmfuco
    @tmfuco 1 year ago +1

    Thanks for this video!
    But, at the same time, it left me somewhat disappointed, since it completely skipped over (the admittedly somewhat more technical) question I specifically had. 😛
    Situation is like this: I have a workstation laptop that is about 5 years old now, with 32GB RAM (64GB max) and a quad-core i7-6820HQ CPU. Obviously the internal GPU, even if I HAD chosen a configuration with a beefy 4+ GB one instead of the 2GB I ended up with, would always be inadequate. But I already figured out that my laptop supports using an eGPU enclosure over Thunderbolt III, and some of those enclosures support up to even an RTX 3090 Ti.
    I was wondering if anyone had anything useful to share about THESE particular questions I'm asking myself:
    1) I know using GPUs in a Thunderbolt III enclosure carries a performance penalty compared to the same GPU in a desktop PC; I was wondering however, whether this experience and test-based fact, mostly established on GAMING workloads, was equally relevant for GPU-compute and more specifically AI-compute and EVEN more specifically Stable Diffusion RENDERING tasks? Is it expected to be WORSE, BETTER or instead LARGELY IRRELEVANT?
    2) a bit along the same lines: would my 5-year old quadcore+hyperthreading laptop CPU somehow be a limiting factor, making a particular class of GPU (say, above RTX 3060 or RTX 3070) just not worth it because it would be held back by the CPU anyway? So an extra investment in a 3080 or even 4070/4080/3090 (let's forget about 4090 :p ) would make absolutely NO sense?
    Would be great if someone had specific experience/insight !

    • @lewingtonn
      @lewingtonn  1 year ago

      haha yeah, definitely a bit of a niche case there. I can confidently say that I have no idea about the Thunderbolt side, since I have no experience there. I will say though that it's unlikely that a 5-year-old CPU (which isn't that old, really) will hold you back, since most of the transformer computations in Stable Diffusion happen on the GPU alone, without bouncing back to the CPU constantly.

  • @spencer5028
    @spencer5028 1 year ago +1

    I'd think the RTX 3060 12GB is the best bang for the buck

  • @Sanguen666
    @Sanguen666 1 year ago +3

    this is a terrible guide. u didn't explain anything. does bus width matter? if so how much? does vram speed matter? if so how much? does cuda count matter? etc... this guide is good for people who have no idea where's left and where's right.

  • @nodewizard
    @nodewizard 1 year ago

    Best GPU is an NVIDIA RTX with 12 GB or more of VRAM (i.e. 3090, 4000 series)

  • @artdigitalenergy3917
    @artdigitalenergy3917 1 year ago +1

    ☝️😄 I guess now you have to show us how to set up a rented RunPod GPU … thank you

    • @lewingtonn
      @lewingtonn  1 year ago

      haha I'll see if I have time!

  • @stavsap
    @stavsap 1 year ago

    Buying a GPU has other benefits, like gaming and video editing, and you can resell it after some time. Renting is a pure waste of money, unless you only want to poke around with SD for a short time and that's it.
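The rent-vs-buy trade-off in the comment above comes down to a break-even point that is easy to compute. A minimal sketch; the card price and hourly rate below are hypothetical illustrations, not quotes from any real provider, and the model ignores resale value, electricity, and price drops:

```python
def break_even_hours(card_price: float, rate_per_hour: float) -> float:
    """Hours of rented GPU time that cost as much as buying the card outright."""
    return card_price / rate_per_hour

# Hypothetical numbers: a $1600 card vs. a $0.50/hr rental.
print(break_even_hours(1600, 0.50))  # 3200.0 hours

# At a pricier $2.00/hr rental, a $1200 card pays for itself much sooner.
print(break_even_hours(1200, 2.00))  # 600.0 hours
```

If your expected usage over the card's useful life is well below the break-even figure, renting wins; well above it, buying wins.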

  • @kessahmed647
    @kessahmed647 11 months ago

    Just buy a used RTX 3060 with 12GB VRAM until the next best card with more speed or VRAM is released; you can find one under $200. Also use an SSD.

  • @davidjacobs8558
    @davidjacobs8558 1 year ago

    How about Intel Iris?

  • @Fate5591
    @Fate5591 4 months ago

    My GPU, a 1050, only has 2GB of VRAM. It seems impossible to run Stable Diffusion, sigh. I have been experimenting with AI-generated content for two years on sites like PixAi, Google Colab, Nai, and more. Currently, I am contemplating generating content on my own PC, but it looks like I'll need to save money to buy the latest GPU.

  • @FilmFactry
    @FilmFactry 1 year ago +5

    I've wanted to know this for a while. I suspect there will be AI-optimized hardware soon, not video-game oriented. Like mining rigs.

    • @ThatBonsaipanda
      @ThatBonsaipanda 1 year ago

      There already is: most of the checkpoints for Stable Diffusion are trained using cards like the Nvidia A100.

  • @amafuji
    @amafuji 1 year ago +1

    1660 TI laptop master race

    • @AmirZaimMohdZaini
      @AmirZaimMohdZaini 1 year ago

      But weak at FP16 operations. GTX 16-series cards often suffer from black images when producing AI images, so additional command-line flags must be used to overcome this issue, which ends up using more memory.
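The workaround alluded to above is usually a pair of launch flags for the AUTOMATIC1111 web UI that force full fp32 precision. A sketch of the relevant setting, assuming a standard install; flag behavior can change between web UI versions, so check the project's own documentation:

```shell
# In webui-user.sh (webui-user.bat on Windows):
# --no-half and --precision full keep the model in fp32 so GTX 16-series
# cards stop producing black images, at the cost of roughly doubling the
# VRAM needed for the weights, as the comment above notes.
export COMMANDLINE_ARGS="--precision full --no-half"
```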

  • @mikemaldanado6015
    @mikemaldanado6015 1 year ago +1

    Do not use Colab. No PRIVACY. There is no such thing as free; at this point everyone should know that if it's free, they're using your data. PRIVACY, PRIVACY. Buy your own GPU.

  • @flamescales7422
    @flamescales7422 1 year ago +1

    Meanwhile Me running stable diffusion on google colab 🗿

  • @ianhzin
    @ianhzin 1 year ago +1

    Unfortunately Colab has banned free users from using Stable Diffusion

  • @mikesavad
    @mikesavad 1 year ago +1

    Boy, this didn't really help me. It was more about renting a PC, which I guess was interesting, but I was really hoping that such a long video would be about video cards and not a long ad for RunPod.

  • @abaape313
    @abaape313 1 year ago

    Intel doesn't work with it?

  • @trashcontent9637
    @trashcontent9637 1 year ago

    I guess I won't run Stable Diffusion on my RX 6950 XT... What a shame.

  • @meikai2135
    @meikai2135 1 year ago +1

    The RTX 3060 12GB works pretty well with Stable Diffusion.
    512x512 needs about 15-20s per pic; 1000px+ takes 2-4 minutes depending on how large a resolution it needs to render. That's actually pretty great compared with my old 1050 Ti, where 512x512 needed about 2-3 minutes per ONE pic.

    • @squoblat
      @squoblat 1 year ago

      I must be doing something wrong; I'm running Stable Diffusion on a 3060 Ti and it's taking a minute-plus per image at 1024x1024 and 30-ish steps. No idea why my speeds are so horrendous.

    • @PsycosisIncarnated
      @PsycosisIncarnated 1 year ago

      @@squoblat it's not like a minute is a long time, you guys, jeez xD

    • @squoblat
      @squoblat 1 year ago

      @@PsycosisIncarnated it is when you're running a batch of 300 images
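The exchange above is easy to quantify: per-image time compounds fast over a batch. A small sketch; the example times are the ones mentioned in these comments, not benchmarks:

```python
def batch_minutes(images: int, seconds_per_image: float) -> float:
    """Wall-clock minutes to render a batch, one image at a time."""
    return images * seconds_per_image / 60

# 300 images at 60 s each vs. 20 s each:
print(batch_minutes(300, 60))  # 300.0 minutes, i.e. 5 hours
print(batch_minutes(300, 20))  # 100.0 minutes
```

A "mere" minute per image turns a 300-image batch into an overnight job, which is why per-image speed matters more to batch users than to people generating one picture at a time.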

  • @erikschiegg68
    @erikschiegg68 1 year ago +2

    The AMD RX 7900 XT has fast matrix-calculation cores for the first time and will probably beat the 4090. And if you do the math comparing the cost of the latest GPU every year against 2800 hours of a far better GPU in a mainframe, you might be better off online.

  • @jimmysrandomness
    @jimmysrandomness 1 year ago

    How about the GeForce RTX 3060 with 16GB of RAM?

    • @mcdazz2011
      @mcdazz2011 1 month ago +1

      The RTX 3060 comes with either 8GB or 12GB of VRAM, not 16GB. You might be thinking of the 4060 TI, which does come in a 16GB VRAM version (along with an 8GB version).
      I'm lucky enough to have the 4060 TI 16GB version, as well as two 3060s, but would love a 3090/4090 with 24GB of VRAM, but they are well out of reach for me.

  • @ImNotQualifiedToSayThisBut
    @ImNotQualifiedToSayThisBut 1 year ago +1

    So basically more vram = better?
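Roughly, yes, within a GPU generation: VRAM decides what fits at all, while compute decides speed. A back-of-the-envelope sketch of why the weights alone already claim a chunk of the card; the ~860M parameter count for the SD 1.5 UNet and the 2-bytes-per-parameter fp16 rule are assumptions for illustration, and real usage adds activations, the VAE, the text encoder, and framework overhead on top:

```python
def weights_vram_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """GiB needed just to hold the model weights at the given precision."""
    return n_params * bytes_per_param / 1024**3

# A ~860M-parameter UNet in fp16 (2 bytes per parameter):
print(round(weights_vram_gib(860e6), 2))  # 1.6 GiB before any overhead

# The same weights in fp32 (4 bytes per parameter) need twice as much,
# which is why the fp32 workarounds for 16-series cards cost extra VRAM.
print(round(weights_vram_gib(860e6, 4), 2))  # 3.2 GiB
```

That is why the gap between a 6GB and a 12GB card matters far more for Stable Diffusion than the same gap would for most games.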

  • @artdigitalenergy3917
    @artdigitalenergy3917 1 year ago +1

    Can we use a rented GPU? ☝️😄

  • @88farrel
    @88farrel 1 year ago +3

    Alright, I'm gonna sell my properties and buy 2 RTX 4090s

    • @lewingtonn
      @lewingtonn  1 year ago +3

      haha isn't that kind of the opposite of what the guide suggested???

    • @stepahinigor
      @stepahinigor 1 year ago +1

      Actually you don't need two; it doesn't work on both at the same time

    • @savekillqqpsounds8473
      @savekillqqpsounds8473 1 year ago

      @@stepahinigor he can have 2 PCs working at the same time, doubling the speed lol

    • @timeless8285
      @timeless8285 1 year ago

      @@stepahinigor you should be able to

  • @itsallaboutyou4968
    @itsallaboutyou4968 1 year ago

    Anyone heard about renting a GPU?
    And Google Colab? 😅

  • @creepyinsane6805
    @creepyinsane6805 1 year ago

    *Thanks for your effort, but I do not think that this has been a good guide.* Imho you seem to be biased regarding the question of whether to rent or to buy a GPU, neglecting the disadvantages of renting a GPU and the advantages of buying one. That was most obvious when you said that setting up Stable Diffusion every time you want to use it for 10 minutes is not a problem at all, while installing a GPU would be something like very hard work. Actually, installing a GPU is one of the easiest things when building a computer, and configuring it is not hard either. But the main point is: you only have to do it once, so it saves a lot of time compared to setting up Stable Diffusion every time you want to do something there. Not taking into account that a GPU can be used for things other than Stable Diffusion is careless, at best. Another point which has unfortunately not been mentioned is that all the money one puts into renting a GPU is gone forever, while it is still possible to sell a GPU once bought. I don't believe that renting a GPU is cheaper for a large majority of people. At the end of the video you said one more really irritating thing, which is that it wouldn't matter if you have to wait 90 seconds for a result or 2 minutes. I don't know how time consumption can be unimportant for someone using Stable Diffusion. It seems to be a very eccentric pov imho. And that seems to be the case for many aspects mentioned in the video. *No offense*

  • @adriangpuiu
    @adriangpuiu 1 year ago

    I'm using it with a 3090 with 24GB of RAM... poor man's setup

    • @jannekallio5047
      @jannekallio5047 1 year ago

      Considering that card, but not sure if it really makes any sense... how much faster it is when using Stable Diffusion remains the question

    • @adriangpuiu
      @adriangpuiu 1 year ago

      @@jannekallio5047 try VoltaML. Almost realtime...

  • @artdigitalenergy3917
    @artdigitalenergy3917 1 year ago +1

    ☝️😄 I have used Colab to generate video before; the issue is it took so long, a 3-second video took like 30 mins...

    • @lewingtonn
      @lewingtonn  1 year ago

      yes that's kind of expected, but this would also happen locally. Colab GPUs are really beefy

  • @amoghghade1
    @amoghghade1 9 months ago

    me watching this take 45min for 30 steps 😂

  • @beatsNstrings
    @beatsNstrings 1 year ago

    Why ain't he telling people that 8GB is CRAP when trying to train embeddings???

  • @pika9985
    @pika9985 1 year ago +1

    That was a very stupid a*s question. The best GPU for this mess that normal people can buy is the RTX 4090, or anything more powerful from next year's scams...
    but for whales, the best one is the A100 from Nvidia... I need one RTX 4090...

  • @Airbender131090
    @Airbender131090 1 year ago

    For the price of a 4090 you can buy 60 months of an unlimited Midjourney subscription (which is WAAAAY better than SD, especially in version 5 that is going to come out in 10 days).
    If I am wrong, tell me why.

    • @WhySoBroke
      @WhySoBroke 1 year ago +1

      LoRA training, Dreambooth, checkpoint merging, ControlNet, privacy, offline use... seriously, I could go on and on... every single day there is something new in A1111.

    • @stepahinigor
      @stepahinigor 1 year ago +2

      MJ is a black box, a toy, you literally have no control over the process at all. You just get random results. With SD you can use LoRa, Dreambooth, ControlNet, you have inpainting, outpainting, sketch tools, you can choose any resolution, you can train your model or download other people's models... MJ is a toy. SD is already a professional tool, people make money with it.

    • @nz69
      @nz69 1 year ago

      u cant make booba in midjourney😛

  • @bustedd66
    @bustedd66 1 year ago

    4 to 6 hours a week is nothing. I do that in a day.

    • @lewingtonn
      @lewingtonn  1 year ago +1

      Huh, what are you generating? Animations or something? For me it's almost always just still images

    • @bustedd66
      @bustedd66 1 year ago +1

      @@lewingtonn yes, sorry, I figured that out; I thought I was doing something wrong and wasting credits :)

  • @ThatGuyFromDetroit
    @ThatGuyFromDetroit 11 months ago

    $2/hr for a 3070 and you call that a good deal for 'all but the most dedicated SD users'? lol. Ignoring the fact that I also play games with it, I have a 3080 Ti, and just using that $2/hr measure, I'd have already broken even. I'd have spent like $100 over the last few days alone from training these LoRA models. Though to be fair, I sometimes leave big batches running overnight or while I'm at work, plus I used to have the auto11webui set up so that my online friends could use it too. I've probably clocked thousands of hours of active SD use on this GPU. I'd also have ripped my eyes out by now if I had to reinstall and then upload all of my checkpoints, LoRAs and embeddings every time I wanted to use it. ... huh, I guess I am a 'dedicated SD user' after all

  • @gerza71
    @gerza71 1 year ago +1

    You are sponsored by Nvidia

  • @nazmeenshaikh2980
    @nazmeenshaikh2980 9 months ago

    I have a GTX 1050 Ti; can I use it? 😂

  • @pastuh
    @pastuh 1 year ago +1

    video starts at 19:02
    Wrong title = wrong info