IPAdapter Plus style transfer with Deforum in ComfyUI

  • Published: 22 Aug 2024
  • In this video I will walk you through the basic usage of the IPAdapter Plus add-on for ComfyUI and how to integrate it into a basic Deforum workflow.
    Be sure to update your ComfyUI installation, Deforum, and IPAdapter Plus to the latest builds.
    Please comment below if you have questions or want to tell me your suggestions for future videos.
    Last week's video can be found here:
    • Cadence interpolation ...
    Here I explain how to install IPAdapter Plus:
    • IP-Adapter V2 / plus f...
    Get the required custom node packages:
    github.com/XmY...
    github.com/cub...
    Get the model:
    civitai.com/mo...
    How to use the scheduling with variables and math for translation and animation (a short example appears at the end of this description):
    github.com/def...
    The Deforum wiki:
    github.com/def...
    More info on Deforum:
    deforum.art/
    / discord
    Support me on Patreon and get the workflow with the base membership:
    / neuron_ai
    Buy me a coffee:
    buymeacoffee.c...
    Connect with me:
    / _neuron_ai
    / _neuron_ai
    / neuron_ai
    The equipment that I use (affiliate links):
    GPU: ASUS ROG STRIX RTX 4090 amzn.to/42ptUTi
    #comfyui #stablediffusion #tutorial #animatediff #automatic1111 #aiart #ai #lama #tutorial #howto
  • Science
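
    As a quick illustration of the scheduling linked above: Deforum parameters take keyframe strings of the form "frame: (value or expression)", and the expressions can use math functions and the current-frame variable t. The parameter names below follow Deforum's scheduling; the values are made-up examples:

        # Illustrative Deforum-style schedules; keyframe syntax is "frame: (value or expression)"
        # and t is the current frame index. The specific numbers are examples only.
        translation_x     = "0: (0), 60: (5)"              # pan to the right over 60 frames
        translation_z     = "0: (0.2*sin(2*3.14*t/60))"    # gentle oscillating push in and out
        rotation_3d_y     = "0: (0.5)"                     # constant slow yaw
        strength_schedule = "0: (0.65)"                    # how much of the previous frame is kept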

Comments • 12

  • @AB-wf8ek • 25 days ago

    Would love to see some examples of your results. It's good to put things like that at the beginning of your video to attract more viewers.
    Thanks for sharing!

  • @Radarhacke • 1 month ago

    Thank you, it works. Tip for everybody: convert the weight widget on the IPAdapter node to an input and connect a Float Incrementer node to it. The loaded image then fades/morphs more smoothly. Play with the incrementer value to get good results. Sorry for my bad English.
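
    A minimal sketch of what that incrementing weight amounts to, outside the node graph (the function name and numbers below are made up for illustration; in ComfyUI you wire a Float Incrementer into the converted weight input rather than writing code):

        # Sketch: grow the IPAdapter weight a little each frame instead of keeping it fixed,
        # which is what feeding a Float Incrementer into the converted weight input achieves.
        def ipadapter_weight(frame, start=0.2, step=0.02, cap=1.0):
            # start, step and cap are illustrative; tune them like the incrementer value.
            return min(start + step * frame, cap)

        # First five frames: 0.2, 0.22, 0.24, 0.26, 0.28 -> the style reference fades in smoothly.
        print([round(ipadapter_weight(f), 2) for f in range(5)])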

  • @miklosnagy5451 • 1 month ago

    Let us know if you can think of any additions or improvements to the nodes! Thanks, mix

    • @neuron_ai • 1 month ago

      Hey mix, yes, there are some things. I will let you know on Discord. ;) Awesome work!

  • @iozsoo • 1 month ago

    Thank you very much!

  • @liangliang2268 • 1 month ago +1

    Is it possible to use a guide image as the first frame, just like we did in A1111? Thanks for your reply!

    • @neuron_ai • 1 month ago

      Yes, you can do that. Simply use the init latent input on the Deforum iterator node. You can load an initial image, but you have to VAE-encode it to a latent and feed it in there. You can then play around with the strength_schedule setting to keep the initial image around for longer or shorter...
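
      A rough sketch of that setup outside the node graph (assumption: a diffusers-style VAE API; in the actual workflow a Load Image node feeds a VAE Encode node, whose output goes into the iterator's init latent input):

        # Assumption: diffusers-style VAE; in ComfyUI a VAE Encode node does this step.
        import numpy as np
        import torch
        from diffusers import AutoencoderKL
        from PIL import Image

        vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse")  # example VAE checkpoint

        image = Image.open("init_frame.png").convert("RGB").resize((512, 512))
        pixels = torch.from_numpy(np.array(image)).float() / 127.5 - 1.0  # scale to [-1, 1]
        pixels = pixels.permute(2, 0, 1).unsqueeze(0)                     # 1x3xHxW batch

        with torch.no_grad():
            init_latent = vae.encode(pixels).latent_dist.sample() * 0.18215  # latent for frame 0

        # strength_schedule (Deforum keyframe syntax, example values) then controls how long the
        # animation sticks close to that initial latent before drifting away from it:
        strength_schedule = "0: (0.85), 30: (0.6), 60: (0.4)"

      The higher the strength early on, the longer the look of the initial image survives.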

    • @Arkayruz • 8 days ago

      @neuron_ai I've been trying for days to generate Deforum animations with a single image as a reference (as the other person mentioned), and for the life of me I have no idea how to do it. Even with what you just explained, I can't find anything on the internet about how to do it. Can you please elaborate?

  • @DerekShenk • 19 days ago

    It would be much better if you showed a real video that demonstrates the results, then showed how to do it.

    • @neuron_ai • 19 days ago

      I thought most people know Deforum and its results since there are so many videos about it. But for future videos I will start with the results.

    • @DerekShenk • 19 days ago

      @neuron_ai There are many Deforum videos and most of them are crap! They show jerky motion, demonstrate difficult processes, and give unclear instructions. You absolutely want to show an amazing finished clip at the start of your video to demonstrate the results you are going to help the viewer achieve. If they see something cool looking, they are going to watch your instructions! That is how you grow your channel. Good stuff!