ComfyUI Flux Upscale: Using Flux ControlNet Tile & 4x Upscale Simple Workflow

  • Published: 20 Sep 2024
  • I have created a basic Flux tile ControlNet workflow, powered by InstantX's Union Pro.
    Currently, it is the only tile ControlNet available for Flux.
    Tile is a great option to have when trying to upscale an image to super high quality.
    Whether you're working with low-resolution images or blurry content, Tile ControlNet is an excellent tool for enhancing them to stunning HD quality. Don't miss out, try it for yourself and see the results! (A rough script-level sketch of the same idea appears just after the tags below.)
    Workflow Overview:
    Uses InstantX's Union Pro model
    Supports multiple ControlNet modes
    Great for upscaling blurry or low-quality images
    ControlNet: huggingface.co...
    huggingface.co...
    huggingface.co...
    huggingface.co....
    Workflow: comfyuiblog.co...
    ------------------------------
    #comfyui #comfy #comfyuiflux
    #flux
    ------------------------------
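    For readers who want a script-level picture of what the node graph does, below is a minimal sketch using the diffusers library instead of ComfyUI. It is not the workflow from the video: the repo IDs ("Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro", "black-forest-labs/FLUX.1-dev"), the tile control-mode index, and every tuning value are assumptions, so check the Union Pro model card before relying on them.

# Minimal sketch of tile-style Flux upscaling with a Union ControlNet (diffusers).
# Repo IDs, the control_mode index for "tile", and all tuning values are assumptions.
import torch
from diffusers import FluxControlNetModel, FluxControlNetPipeline
from diffusers.utils import load_image

controlnet = FluxControlNetModel.from_pretrained(
    "Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro",   # assumed Union Pro repo
    torch_dtype=torch.bfloat16,
)
pipe = FluxControlNetPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    controlnet=controlnet,
    torch_dtype=torch.bfloat16,
).to("cuda")

# The low-resolution / blurry input is resized up and fed back in as the tile hint.
image = load_image("low_res_input.png")
control_image = image.resize((image.width * 4, image.height * 4))

result = pipe(
    prompt="a sharp, highly detailed photo",
    control_image=control_image,
    control_mode=1,                      # assumed index of the "tile" mode on Union Pro
    controlnet_conditioning_scale=0.6,   # lower keeps more of the original content
    width=control_image.width,
    height=control_image.height,
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
result.save("upscaled_output.png")

    At bf16 this needs a large GPU; the ComfyUI workflow in the video offers fp8 and GGUF variants for lower VRAM.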

Comments • 22

  • @FredFraiche 21 hours ago +2

    Let it be known that this is a workflow for SUPER LOW resolution (sub-512) images, so it is a great help if you are upscaling an image of a scanned stamp. It will not work with even a basic 832x1216, so do not get fooled.

  • @JoeBurnett 1 day ago

    Thank you for sharing your workflow! Liked and now subscribed.

  • @spiritform111 21 hours ago +1

    hm... for some reason it is giving my images a bit of a double vision / blur... it's not as sharp. is this a gguf thing?

  • @lucifer9814 2 days ago

    Thanks a ton for this workflow and tutorial, I just need to find the right CLIP models and this should work for me!! It's unfortunate that I never downloaded any of the CLIP models used here, so I gotta look for them.

  • @muthurajradirajjj 20 hours ago

    just noticed, all the asset names are changed to "diffusion_pytorch_model.safetensors"

  • @DavidDji_1989 1 day ago +1

    what do you mean by low VRAM? Below 8 GB? Below 16 GB?

  • @97BuckeyeGuy 3 hours ago

    When I run this, my resulting image looks absolutely awful. It's a bit blurry and it totally destroyed the face of the man in the starting image. This ControlNet must be pretty weak.

  • @usserFunnySpecific 2 days ago +1

    Hello, where can I download all these models? I go to the same place you show, where it says "resources" on huggingface, but there are only models named "diffusion_pytorch_model.safetensors".

    • @ComfyUIworkflows 2 days ago

      huggingface.co/Kijai/flux-fp8/tree/main
      huggingface.co/lllyasviel/flux_text_encoders/tree/main
      huggingface.co/nerualdreming/flux_vae/tree/main.
      Here are the checkpoint, text encoder, and VAE files.
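      As a rough picture of getting these files into the usual ComfyUI folders, and giving the generically named "diffusion_pytorch_model.safetensors" ControlNet file a recognisable name, here is a small sketch with huggingface_hub. The filenames inside each repo, the ControlNet repo ID, and the folder layout are assumptions; check each repo's file list and your own ComfyUI install first.

# Sketch only: download the models and copy them into typical ComfyUI folders.
# Every repo filename and target path below is an assumption, not a verified listing.
import shutil
from pathlib import Path
from huggingface_hub import hf_hub_download

COMFY = Path("ComfyUI/models")  # adjust to your install

downloads = [
    # (repo_id, filename in repo, ComfyUI subfolder, name to save as)
    ("Kijai/flux-fp8", "flux1-dev-fp8.safetensors", "unet", "flux1-dev-fp8.safetensors"),
    ("lllyasviel/flux_text_encoders", "clip_l.safetensors", "clip", "clip_l.safetensors"),
    ("lllyasviel/flux_text_encoders", "t5xxl_fp8_e4m3fn.safetensors", "clip", "t5xxl_fp8_e4m3fn.safetensors"),
    ("nerualdreming/flux_vae", "ae.safetensors", "vae", "flux_ae.safetensors"),
    ("Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro", "diffusion_pytorch_model.safetensors",
     "controlnet", "flux-union-pro.safetensors"),
]

for repo_id, filename, subfolder, save_as in downloads:
    cached = hf_hub_download(repo_id=repo_id, filename=filename)
    target = COMFY / subfolder / save_as
    target.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy(cached, target)  # rename on copy so the loader shows a meaningful name
    print(f"{repo_id}/{filename} -> {target}")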

  • @captainpike3490 2 days ago

    Great tutorial! I wonder if it can run on 12GB VRAM. When placing nodes, can they be connected in order? It's hard to follow. Is it possible to keep workflows neat and tidy using groups? Thank you

  • @graphilia7 2 days ago +1

    Thanks for the video. I get this error: "KSampler: mat1 and mat2 shapes cannot be multiplied (1x768 and 2816x1280)"
    Can you please help? What should I do? My image is 1888x1056px.

    • @ComfyUIworkflows 1 day ago

      Yes, it is due to the image size; this workflow is best for blurry images. As you mention, your image is 1888x1056px, so it is already optimized and this will not work at that resolution. You can try SUPIR for that.

    • @graphilia7 1 day ago

      @@ComfyUIworkflows Many thanks, I will try 👍

    • @helveticafreezes5010 1 day ago

      @@ComfyUIworkflows I don't think that is the issue. There is something wrong in this workflow. I tried multiple image sizes from 512x512 down to 66x66 and I get this same error every time. All these images were under 100kb and were low quality. Please do some testing and quality control when putting out your information so you're not wasting people's time.

    • @ComfyUIworkflows 1 day ago

      @@helveticafreezes5010 The workflow is working, it has been tested. What error are you getting in cmd? Also make sure the DualCLIPLoader type is set to flux.
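      For what that error usually means: a 1x768 conditioning tensor is being fed into a layer that expects 2816 input features, which points at text conditioning built with a non-Flux CLIP setup rather than at the image size, and that matches the advice above about the DualCLIPLoader type. A tiny illustrative snippet (the dimensions come from the error message, not from the workflow):

import torch

# Reproduce the reported shape error: a 768-wide conditioning tensor cannot be
# multiplied into a weight matrix that expects 2816 input features.
cond = torch.randn(1, 768)        # conditioning from the wrong CLIP type
weight = torch.randn(2816, 1280)  # layer weight expecting 2816-dim input

try:
    cond @ weight
except RuntimeError as err:
    # "mat1 and mat2 shapes cannot be multiplied (1x768 and 2816x1280)"
    print(err)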

  • @97BuckeyeGuy 2 days ago +3

    You should be upfront about how much VRAM this workflow will need. You're not going to be able to run this locally without at least 24GB of VRAM.

    • @ComfyUIworkflows 2 days ago

      @@ZERKA-p4b it will work if you have 6 GB VRAM, but you have to use the GGUF Loader.
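      For anyone wondering what the GGUF route looks like outside ComfyUI (where it means the ComfyUI-GGUF custom nodes), here is a small sketch using diffusers' GGUF support. The quantised checkpoint repo and filename are assumptions; pick whichever quantisation level fits your VRAM.

# Sketch: load a GGUF-quantised Flux transformer to cut VRAM use (diffusers >= 0.32).
# The repo and filename are assumptions; smaller quants (Q2/Q4) fit smaller GPUs.
import torch
from diffusers import FluxPipeline, FluxTransformer2DModel, GGUFQuantizationConfig

transformer = FluxTransformer2DModel.from_single_file(
    "https://huggingface.co/city96/FLUX.1-dev-gguf/blob/main/flux1-dev-Q4_K_S.gguf",
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    torch_dtype=torch.bfloat16,
)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # offload to keep peak VRAM low on small GPUs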

    • @lucifer9814 2 days ago +3

      I am running this on an 8GB 4060; it takes a while but it works just fine!! Besides, he did mention the 2 variations at the start of the video.

  • @christofferbersau6929 15 hours ago

    where is the controlnet model??

  • @fontenbleau 13 hours ago

    Not the same face, a different man in the upscale. That is the main problem with all these tools, which prevents using them for any serious restoration. It makes an "eye candy" result as usual, the core behavior of all this AI, but nothing related to the original.