Seamless Outpainting with Flux in ComfyUI (Workflow Included)

  • Published: Sep 20, 2024

Comments • 23

  • @Bpmf-g3u
    @Bpmf-g3u 3 days ago

    I run Flux on mimicpc and it handles details quite well too; the results made for a surprisingly good visual experience.

  • @dameguy_90
    @dameguy_90 7 days ago

    You are a genius. My subscription is worth it.

    • @my-ai-force
      @my-ai-force  7 days ago

      Thanks a ton for your support.

  • @philippeheritier9364
    @philippeheritier9364 7 days ago

    It works very, very well. A big thank you for this brilliant tutorial!

  • @GrocksterRox
    @GrocksterRox 8 days ago

    Very well thought out. Kudos!

    • @my-ai-force
      @my-ai-force  7 days ago +1

      Thanks for your kind words.

  • @CasasYLaPistola
    @CasasYLaPistola 6 days ago +1

    Thanks for the video and the workflow. I've used it and everything works fine until it reaches the Flux group. The problem is that I don't know which directory I should copy the Flux model into, because if I understood correctly, the Flux models don't go in the same directory as the 1.5 and SDXL checkpoints. In short, when I copy them there, the workflow gives me an error. Can you tell me where to put it? Also, until now, to use Flux models I didn't use the usual "Load Checkpoint" node; instead I used the "DualCLIPLoader" node.
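
    A quick way to see which folders a given ComfyUI install actually scans is to ask its folder_paths module directly. The sketch below is not from the video; it assumes it is run from the ComfyUI root with the same Python environment ComfyUI uses. As a rule of thumb, all-in-one Flux checkpoints loaded with "Load Checkpoint" go under the "checkpoints" paths, while UNet-only Flux files loaded with "Load Diffusion Model" plus DualCLIPLoader go under "unet", "clip" and "vae".

    # Minimal diagnostic sketch (assumption: run from the ComfyUI root).
    # Prints where this install looks for each model type, so you can see
    # which directory your Flux file is expected to live in.
    import folder_paths

    for kind in ("checkpoints", "unet", "clip", "vae", "controlnet"):
        try:
            print(kind, "->", folder_paths.get_folder_paths(kind))
        except KeyError:
            # Older ComfyUI builds may not register every folder name.
            print(kind, "-> (not registered in this ComfyUI version)")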

  •  6 days ago

    This is wonderfully good work, thank you for sharing! One question: I can only guess where to place the initial image using the x/y parameters. Is there a better way to do this? Anyway, great!
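
    One way to take the guesswork out of the x/y placement is to pad the source image onto the target canvas outside ComfyUI and read the offsets off directly. This is a stand-alone sketch with plain Pillow; the function name, the grey fill and the centering rule are illustrative assumptions, not part of the video's workflow, and it assumes the canvas is at least as large as the image.

    # Sketch: compute x/y offsets for placing the source image on an
    # outpainting canvas (here simply centered), then save the padded image.
    from PIL import Image

    def pad_for_outpaint(img_path, canvas_w, canvas_h):
        img = Image.open(img_path).convert("RGB")
        x = (canvas_w - img.width) // 2   # horizontal offset of the original
        y = (canvas_h - img.height) // 2  # vertical offset of the original
        canvas = Image.new("RGB", (canvas_w, canvas_h), (128, 128, 128))
        canvas.paste(img, (x, y))
        return canvas, x, y

    canvas, x, y = pad_for_outpaint("input.png", 1920, 1080)
    print(f"source image placed at x={x}, y={y}")
    canvas.save("padded.png")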

  • @gardentv7833
    @gardentv7833 6 days ago

    After many model re-downloads, it works. Thank you! It took me two days to figure out.

    • @ritikagrawal8454
      @ritikagrawal8454 3 days ago

      I was able to download all the nodes and models, but my ComfyUI just isn't loading. Did you face a similar issue? If not, can you still please tell me what worked for you?

  • @happyme7055
    @happyme7055 5 days ago

    Stunning!!!!! First working outpaint ever ;-) GJ! Two things would be useful, I guess... a negative prompt and an optional lineart ControlNet implementation...

  • @mcdigitalargentina
    @mcdigitalargentina 5 days ago

    Great work, friend! Subscribed to your channel. Thanks for sharing your work.

  • @spelgenoegen7001
    @spelgenoegen7001 3 days ago

    Awesome! Everything works perfectly with diffusion_pytorch-model_promax.safetensors. Thanks!

  • @วรายุทธชะชํา
    @วรายุทธชะชํา 5 days ago +1

    I want to generate multiple sizes in one round. How can I do that, sir?

  • @deonix95
    @deonix95 4 days ago +1

    Error occurred when executing CheckpointLoaderSimple:
    ERROR: Could not detect model type of: D:\Programs\SD\models/Stable-diffusion\flux1-dev-fp8.safetensors
    File "D:\Programs\ComfyUI_windows_portable\ComfyUI\execution.py", line 317, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "D:\Programs\ComfyUI_windows_portable\ComfyUI\execution.py", line 192, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "D:\Programs\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
    File "D:\Programs\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "D:\Programs\ComfyUI_windows_portable\ComfyUI
    odes.py", line 539, in load_checkpoint
    out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"))
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "D:\Programs\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 527, in load_checkpoint_guess_config
    raise RuntimeError("ERROR: Could not detect model type of: {}".format(ckpt_path))
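
    This "Could not detect model type" error usually means the file handed to "Load Checkpoint" (CheckpointLoaderSimple) is not an all-in-one checkpoint, or the ComfyUI build is too old to recognize Flux. One way to see what the file actually contains is to list its tensor names. The sketch below is a diagnostic, not part of the workflow; the key prefixes are typical but vary between exports, so treat them as assumptions. An all-in-one Flux checkpoint carries diffusion-model, text-encoder and VAE weights, while a UNet-only export (which belongs in models/unet and loads with "Load Diffusion Model" plus DualCLIPLoader) carries only the diffusion-model weights.

    # Sketch: peek inside the .safetensors file from the error above
    # (adjust the path to your install) and count tensors under a few
    # telltale prefixes.
    from safetensors import safe_open

    path = r"D:\Programs\SD\models\Stable-diffusion\flux1-dev-fp8.safetensors"
    with safe_open(path, framework="pt", device="cpu") as f:
        keys = list(f.keys())

    print(len(keys), "tensors")
    for prefix in ("model.diffusion_model.",  # UNet inside an all-in-one checkpoint
                   "text_encoders.",          # CLIP/T5 inside an all-in-one checkpoint
                   "vae.",                    # VAE inside an all-in-one checkpoint
                   "double_blocks."):         # raw Flux transformer (UNet-only export)
        print(prefix, sum(k.startswith(prefix) for k in keys))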

  • @MatthewWaltersHello
      @MatthewWaltersHello 1 day ago

    I find it makes the eyes look like googly eyes. How do I fix that?

  • @DarioToledo
    @DarioToledo 7 days ago

    I didn't know about that Union repaint ControlNet. What does it do?

    • @my-ai-force
      @my-ai-force  7 days ago

      It's used for inpainting.

    • @DarioToledo
      @DarioToledo 7 days ago

      @my-ai-force And what difference does it make compared to usual inpainting without a ControlNet? I tried to run it, but it gave me errors.

  • @maelstromvideo09
    @maelstromvideo09 2 days ago

    Try differential diffusion; it makes inpainting better, without most of this pain.