This New Version of FLUX Runs on 6GB VRAM and is 35% Faster - Flux NF4 Install Guide

  • Published: 26 Jan 2025
  • Flux NF4 is a highly optimized version of the Flux AI model designed to run efficiently on GPUs with as little as 6GB VRAM. It delivers nearly the same image quality as the original model but with significantly faster performance, up to 35% faster, making it perfect for those with lower-end hardware. With a built-in VAE and compatibility with platforms like ComfyUI and Forge, Flux NF4 makes high-resolution image generation accessible and quick, even on less powerful systems.
    Links:
    NF4 Model: huggingface.co...
    Workflow: turboflip.de/f...
    Commands (see the sketch below): git clone github.com/com...
    pip install -r requirements.txt
    #stablediffusion #flux1 #chatgpt #FluxNF4
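    A rough sketch of the full install sequence (assuming a standard ComfyUI setup; the truncated links above carry the exact model and repo URLs):
    cd ComfyUI/custom_nodes
    git clone https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4.git
    cd ComfyUI_bitsandbytes_NF4
    pip install -r requirements.txt
    # place the NF4 checkpoint from the Hugging Face link (e.g. flux1-dev-bnb-nf4-v2.safetensors) into ComfyUI/models/checkpoints,
    # then restart ComfyUI and load it with the CheckpointLoaderNF4 node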

Comments • 89

  • @ImShubhamY
    @ImShubhamY 4 months ago +11

    Flux is heavily trained on high quality images, so even if you want to, it's hard to get low quality images such as an amateur selfie.
    I'd rather wait longer than mass produce lower quality images. I usually lower the resolution while I work on my prompt and other settings, and once I get what I want, I increase the resolution and generate more. Using OG Flux dev on just an 8GB VRAM card takes 2 minutes for 1024x1024 at 23 steps, but the images are absolutely worth it. ControlNet is a problem though.

    • @ImShubhamY
      @ImShubhamY 4 months ago

      So I tried it and it took slightly less than 50% of the time compared to OG Flux dev, meaning roughly 2x faster while generating a similar image. But on my PC I sometimes face issues where generation takes more than 6x the time of Flux dev and 12x the time of NF4; it happens every 3-4 images, so sometimes it works, sometimes it falls on its head. Very inconsistent, and it would definitely end up slower than regular Flux dev if I generated 100 images. I know it has something to do with VRAM being full, but Flux dev, a 24GB model, works consistently on 8GB VRAM; nothing seems to compare to it.

    • @fus3n
      @fus3n 4 months ago +2

      @@ImShubhamY nf4 was really slow on my 8gb card

  • @06ju
    @06ju 4 months ago

    Great review! Really started generating quickly! Thanks to the author!

  • @gabrielyeamans7289
    @gabrielyeamans7289 4 months ago +8

    Hey, I could be wrong, but I think that Flux GGUF is even faster?

  • @SVPREMVS
    @SVPREMVS 1 month ago +2

    Error: All input tensors need to be on the same GPU, but found some tensors to not be on a GPU:
    [(torch.Size([393216, 1]), device(type='cpu')), (torch.Size([1, 256]), device(type='cuda', index=0)), (torch.Size([1, 3072]), device(type='cuda', index=0)), (torch.Size([12288]), device(type='cpu')), (torch.Size([16]), device(type='cpu'))]

    • @razwangazi
      @razwangazi 28 days ago

      You need an NVIDIA card, or upgrade your NVIDIA driver.
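
      (A quick way to confirm PyTorch can actually see the GPU before blaming the workflow, a sketch run from the same Python environment ComfyUI uses; on the Windows portable build call python_embeded\python.exe instead of python:)
      python -c "import torch; print(torch.cuda.is_available(), torch.version.cuda)"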

  • @XirlioTLLXHR
    @XirlioTLLXHR 4 months ago +3

    Am I the only one getting a missing node type error when trying to load CheckpointLoaderNF4? This is the 2nd guide I've followed and I'm doing literally the same thing... and it doesn't work for whatever reason.

    • @ImShubhamY
      @ImShubhamY 4 months ago

      You have to install the custom node "ComfyUI_bitsandbytes_NF4", and it's very WIP. You have to set ComfyUI Manager to the dev channel and sometimes reduce the security level in its config file from "normal" to "weak" or "normal-".
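
      (Roughly what that looks like, as a sketch; the config.ini location varies by ComfyUI-Manager version, e.g. ComfyUI/custom_nodes/ComfyUI-Manager/config.ini or user/default/ComfyUI-Manager/config.ini:)
      # in ComfyUI-Manager's config.ini, relax the security level so experimental nodes can be installed
      security_level = weak
      # or: security_level = normal-
      # then switch the Manager channel from "default" to "dev" and install ComfyUI_bitsandbytes_NF4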

  • @z-inkp6478
    @z-inkp6478 4 months ago +1

    8:30 when I paste the link, it says it doesn't recognize it and can't download

    • @elpixel6135
      @elpixel6135  4 months ago

      make sure you have git installed, then paste this: git clone https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4.git

    • @z-inkp6478
      @z-inkp6478 4 months ago

      @@elpixel6135 it worked, but on the next step where you paste pip install -r requirements.txt, it says it doesn't recognize "pip". Do I have to install something else?

  • @mupmuptv
    @mupmuptv 4 months ago +5

    I’ve heard about a new technology called GGUF. Is it true? How does it compare to NF4?

  • @erenyaegar-tx2dd
    @erenyaegar-tx2dd 1 month ago +1

    Error: KSampler "Expected a cuda device, but got: cpu", but I have a GPU with CUDA enabled

  • @petertremblay3725
    @petertremblay3725 1 month ago

    Dragging the image didn't work to get the workflow, is there another way?

  • @eukaryote-prime
    @eukaryote-prime 4 months ago

    Flux pro is wild!

  • @saberkz
    @saberkz 4 months ago +1

    What about text to image? Is this image to image?

  • @duytatdt4884
    @duytatdt4884 3 months ago +2

    my 2060 6GB takes 30 minutes to create a photo :v

  • @possiblynotrohit
    @possiblynotrohit 4 months ago

    Does it support img2img and ControlNet?

  • @victorprestaia9701
    @victorprestaia9701 4 months ago

    On the step "pip install -r requirements.txt" I get the error "'pip' is not recognized as an internal or external command,
    operable program or batch file."
    How do I solve this?

    • @NewsFixTV
      @NewsFixTV 4 months ago

      I got this problem too... any solutions?

    • @victorprestaia9701
      @victorprestaia9701 4 months ago

      @@NewsFixTV No, the story ended with me breaking ComfyUI and reinstalling it

    • @0000000Mario0000000
      @0000000Mario0000000 2 months ago

      I think having Linux and Python installed is necessary. You can verify that.
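
      (For the "'pip' is not recognized" error above: the Windows portable build of ComfyUI has no system-wide pip, so a common workaround, sketched here assuming the standard portable folder layout, is to call the embedded interpreter directly:)
      rem run from the ComfyUI_windows_portable folder
      python_embeded\python.exe -m pip install -r ComfyUI\custom_nodes\ComfyUI_bitsandbytes_NF4\requirements.txt
      rem or, if a system Python is installed and on PATH:
      python -m pip install -r requirements.txt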

  • @anianait
    @anianait 4 months ago +12

    I wouldn't call it a new version of FLUX... A new version of Flux would be Flux 1.2 or Flux 2... I would call that an "altered version" or a "new quantized version"...

  • @motopaediatheview9284
    @motopaediatheview9284 4 months ago +2

    The original flux-dev and flux-schnell already work on a GTX 1060 with 6GB VRAM. Slow, but it does the job...

    • @synthoelectro
      @synthoelectro 4 months ago +1

      Yeah, I can't get away from the original, nothing else works right.

    • @rifatshahariyar
      @rifatshahariyar 4 months ago

      Which version is suitable for me? RTX 3050 with 4GB VRAM

    • @mikrodizels
      @mikrodizels 4 months ago

      I have the same GPU as you. How many seconds does it take you to generate one 1024x1024 flux-dev or flux-schnell picture?

    • @motopaediatheview9284
      @motopaediatheview9284 4 months ago +1

      @@mikrodizels flux-dev takes long to load, about 1,500 seconds, but generating the image is only about 120 seconds per iteration. Since flux-dev does most jobs in 6-10 steps, it is about 15-20 minutes per image. I tend to use 1024x768 in either portrait or landscape.
      I load from an old HDD and I believe that is why loading the model takes so long (well, relatively).
      I haven't used flux-schnell recently, will check another time.
      I have time, it is for fun, so I don't mind much.
      CPU is an i5-13600K with 32GB RAM, all stock standard.

    • @eukaryote-prime
      @eukaryote-prime 4 months ago +1

      I can do flux on my 980ti 6gb. 1280x768 is like 6 minutes an image for me and I'm shocked that I can even generate that size...

  • @isaacmartinez9759
    @isaacmartinez9759 4 months ago

    Can I use a lora node to generate consistent character images?

  • @ItamPT
    @ItamPT 4 months ago

    Thank you

  • @sinayagubi8805
    @sinayagubi8805 4 months ago +2

    You didn't say how much VRAM the schnell model was using

    • @elpixel6135
      @elpixel6135  4 months ago

      12GB. You might be able to run it with only 8GB of VRAM, but it will take longer to generate

  • @kanto1971
    @kanto1971 4 months ago

    NF4 is deprecated in ComfyUI

  • @MuckoMan
    @MuckoMan 4 months ago

    How are you using FLUX Pro on a local machine? Set up an addict please lol.

  • @fan-funrecut
    @fan-funrecut 4 months ago +2

    FAILED: ComfyUI_bitsandbytes_NF4 [EXPERIMENTAL]. And if I load this workflow, it's not loading. It shows some errors too: Invalid workflow against zod schema:
    Validation error: Required at "last_link_id"; Required at "nodes"; Required at "links"; Required at "version"; Required at "last_node_id"

    • @cekuhnen
      @cekuhnen 4 months ago

      Same issue. That's why I honestly dislike ComfyUI and such: always issues with the scripts and such...

  • @erenyaegar-tx2dd
    @erenyaegar-tx2dd 1 month ago

    All input tensors need to be on the same GPU, but found some tensors to not be on a GPU:
    [(torch.Size([4718592, 1]), device(type='cpu')), (torch.Size([1, 3072]), device(type='cuda', index=0)), (torch.Size([1, 3072]), device(type='cuda', index=0)), (torch.Size([147456]), device(type='cpu')), (torch.Size([16]), device(type='cpu'))]

  • @snct6342
    @snct6342 4 months ago

    do you know how I can add a lora loader into this?

  • @jameshastie3864
    @jameshastie3864 4 months ago

    SamplerCustomAdvanced
    'ForgeParams4bit' object has no attribute 'quant_storage'

  • @Aristocle
    @Aristocle 4 months ago

    Which is the fastest model in the case of a 12GB VRAM video card?

  • @8BitRetroRabbit
    @8BitRetroRabbit 4 months ago

    Is Flux made in Germany, because of the Flux "schnell" (fast) name?

    • @burtpanzer
      @burtpanzer 4 months ago +1

      Yes, Germany is also where the Black Forest is.

    • @8BitRetroRabbit
      @8BitRetroRabbit 4 months ago

      @@burtpanzer I've seen it, "Freiburg im Schwarzwald, Deutschland" is 110-120 km away from me 🤣

    • @burtpanzer
      @burtpanzer 4 months ago +1

      @@8BitRetroRabbit Okay, I'm in California but it was an easy question to answer...

    • @burtpanzer
      @burtpanzer 4 months ago +1

      @@8BitRetroRabbit I guess it wasn't mentioned in this video but the German software company that makes Flux is called Schwarzwald. =D

    • @8BitRetroRabbit
      @8BitRetroRabbit 4 months ago

      @@burtpanzer I never heard of it before, so no good marketing for me :D

  • @quercus3290
    @quercus3290 4 months ago +7

    If you're running Flux on 6 gigs with an NF4 model, you may as well use XL at that point.

  • @hatuey6326
    @hatuey6326 4 months ago

    One thing I don't understand with Flux schnell (I don't use the others) is that I never get a realistic result using Euler. I changed to LCM and it's far better

    • @ImShubhamY
      @ImShubhamY 4 months ago

      Yes, schnell doesn't give a similar (realistic) image despite the same settings

  • @rezvanbagheri1901
    @rezvanbagheri1901 4 months ago

    You show system memory, not VRAM.
    VRAM is more limited

  • @tjw2469
    @tjw2469 4 months ago

    Flux pro obviously has better lighting compared to the other versions.

  • @FloetschMaster
    @FloetschMaster 3 months ago

    My Flux dev is sooo slow in comparison to SD 1.5, even with hires fix and ADetailer

    • @elpixel6135
      @elpixel6135  3 months ago

      Flux is a very resource-intensive model, so it will naturally be slower than SDXL or SD 1.5

    • @FloetschMaster
      @FloetschMaster 3 months ago

      @@elpixel6135 At this moment I see no advantage to using Flux; I get worse results compared to SD 1.5 when it comes to realism. And it's way slower, I mean 10x slower even at a lower resolution...

  • @One_Harmony
    @One_Harmony 4 months ago +3

    I'm running a version of comfyui-zluda because my GPU is an AMD MSI Armor Radeon RX 580 with 8GB VRAM. But I need to use Flux.1. Please teach me how. Any video tutorial for comfyui-zluda with Flux.1? Thanks. 🙏🏻

  • @Nightowl_IT
    @Nightowl_IT 4 months ago

    Does it work on AMD?

    • @elpixel6135
      @elpixel6135  4 months ago

      You might be able to get it working on AMD GPUs, but it's not that simple. You might want to stick with online image generators

  • @youMEtubeUK
    @youMEtubeUK 4 months ago +1

    I can use young on flux pro

  • @dolboeb-tz4bw
    @dolboeb-tz4bw 3 months ago

    Not working with a 5700 XT. The model is not even loading into memory

  • @maranathasam
    @maranathasam 4 months ago

    1:02 Flux schnell looks better and more realistic

    • @elpixel6135
      @elpixel6135  4 months ago

      Sometimes it does look better than the other versions

  • @eiermann1952
    @eiermann1952 4 months ago

    age xx works always

  • @DodiInkoTariah
    @DodiInkoTariah 4 months ago

    Seems not to work on Mac at all

    • @cekuhnen
      @cekuhnen 4 months ago

      Use DiffusionBee and save yourself all that ComfyUI/A1111 nightmare

  • @shirleywang9584
    @shirleywang9584 4 months ago

    Hi, I'm Tess from Digiarty Software. Interested in a collab?

  • @frankieb.fernandez79
    @frankieb.fernandez79 3 months ago +1

    Your process is already outdated and no longer usable. Shoot the video again and repost.

  • @riered9reda101
    @riered9reda101 4 months ago +1

    Stay with SD. Flux is SFW only

    • @netneo4038
      @netneo4038 4 months ago +4

      Not if you apply an NSFW LoRA to it.

    • @riered9reda101
      @riered9reda101 4 months ago

      @@netneo4038 can it?

    • @mikrodizels
      @mikrodizels 4 months ago +1

      PonyXL models are getting amazing now too

    • @YSNReview
      @YSNReview 4 months ago

      What's the best version of SDXL and which model? What's best for Auto1111? I do not like ComfyUI.

    • @mal2ksc
      @mal2ksc 4 months ago

      FLUXTASTIC, FLUX C4PACITOR, and scg-anatomy already exist to fill that gap, with more LoRAs coming out all the time.

  • @androsforever500
    @androsforever500 4 months ago +1

    I get this message when updating all: FAILED: ComfyUI_bitsandbytes_NF4 [EXPERIMENTAL]