LivePortrait: Add emotions to your still images by animating the face in ComfyUI - Stable Diffusion

  • Published: 6 Sep 2024

Comments • 14

  • @walidkh-sansfiltre
    @walidkh-sansfiltre A month ago

    Hello, great, thank you! How do I build a web app on top of ComfyUI LivePortrait?

    • @CodeCraftersCorner
      @CodeCraftersCorner  A month ago +1

      Hello, you can do the same as in my previous videos on the ComfyUI API: save the workflow (JSON) file and load it in your app.
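
      A minimal sketch of that approach, assuming ComfyUI is running locally on its default port 8188 and the workflow was exported with "Save (API Format)"; the file name is just a placeholder:

          # Sketch: queue a saved LivePortrait workflow on a local ComfyUI server.
          # Assumes ComfyUI is reachable at 127.0.0.1:8188 and the JSON file was
          # exported with "Save (API Format)".
          import json
          import urllib.request

          def queue_workflow(path, server="http://127.0.0.1:8188"):
              with open(path, "r", encoding="utf-8") as f:
                  workflow = json.load(f)
              payload = json.dumps({"prompt": workflow}).encode("utf-8")
              req = urllib.request.Request(
                  f"{server}/prompt",
                  data=payload,
                  headers={"Content-Type": "application/json"},
              )
              with urllib.request.urlopen(req) as resp:
                  return json.load(resp)  # response includes a prompt_id to poll

          if __name__ == "__main__":
              print(queue_workflow("live_portrait_workflow_api.json"))

      A web app can wrap this in a request handler: the user uploads an image, the app patches the matching input node in the loaded JSON, queues the prompt, and then fetches the rendered output from ComfyUI.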

  • @sunlightlove1
    @sunlightlove1 A month ago

    Always great contribution!

  • @CGzzzzzzzzzzz
    @CGzzzzzzzzzzz 26 days ago

    Hi, I really like your videos. I am new to LivePortrait, and looking at the GitHub page there are many updates. So is this video a good starting point, or is it outdated? And will you make a new video on LivePortrait eventually? I appreciate your work 👍👍

    • @CodeCraftersCorner
      @CodeCraftersCorner  24 days ago +1

      Hello, this video is still good for version 1. There is a new version of LivePortrait now that may give better results.

  • @yngeneer
    @yngeneer A month ago

    Did you, or do you plan to, take a look at implementing LivePortrait in a video-to-video workflow? Is it even possible?

    • @CodeCraftersCorner
      @CodeCraftersCorner  A month ago +1

      Hello @yngeneer, LivePortrait primarily works by animating a single image, so integrating it directly into a video-to-video workflow might be challenging at the moment. The main purpose is to use a video to drive the animation of the head. These projects are advancing really fast, so it may be possible in the near future. Motion capture is already widely used to animate 3D models in the film and gaming industries; soon it might be available for 2D video animation as well.

    • @CodeCraftersCorner
      @CodeCraftersCorner  A month ago +1

      Not sure why my previous reply is not showing.
      LivePortrait primarily works by animating a single image, so integrating it directly into a video-to-video workflow might be challenging. The main purpose is to use a video to drive the animation of the head. Since these projects are moving so fast, it may be possible in the near future. Motion capture is already widely used to animate 3D models in the film and gaming industries; soon it might be available for 2D video animation as well.

    • @yngeneer
      @yngeneer A month ago

      @CodeCraftersCorner OK, thank you.

    • @CodeCraftersCorner
      @CodeCraftersCorner  A month ago

      👍

  • @vickyrajeev9821
    @vickyrajeev9821 A month ago

    Thanks! Can I run it on the CPU? I don't have a GPU.

    • @CodeCraftersCorner
      @CodeCraftersCorner  A month ago

      Not sure! I checked the resources used during generation. For me, it took about 2 GB of VRAM and the CPU was at 100%. You can give it a try; it may work, although it will be slow. Alternatively, you can try the Hugging Face Space. It is free for now and should be fast.
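
      A quick way to check which hardware PyTorch (and therefore ComfyUI) will actually use before committing to a long run; a minimal sketch, assuming it is run inside the same Python environment as ComfyUI:

          # Report whether a CUDA GPU is visible to PyTorch; without one,
          # generation falls back to the CPU and is much slower.
          import torch

          if torch.cuda.is_available():
              name = torch.cuda.get_device_name(0)
              vram_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
              print(f"GPU available: {name} ({vram_gb:.1f} GB VRAM)")
          else:
              # ComfyUI can also be launched in CPU-only mode: python main.py --cpu
              print("No CUDA GPU detected; expect slow CPU-only generation.")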
