Remaking NOSFERATU with AI (Runway Gen-3 Alpha)

  • Published: 23 Dec 2024

Comments •

  • @vidiioco · 2 months ago · +6

    Wow that looks great

  • @giri.goyo_yt · 2 months ago · +2

    Ha. Was waiting for this. When I get some more time, I want to re-do my old digital flick. Thanks for a great demo.

  • @urniurl · 2 months ago · +1

    Looks great, the potential is massive.

  • @cinematicambience · 2 months ago · +4

    That looks so cinematic now

  • @wisdomwonder · 2 months ago · +5

    Regenerating seems like the right term

  • @xombiemike · 2 months ago · +1

    "Reimagining" I believe fits best.

  • @norgaardcarlos · 2 months ago · +1

    Cool exercise, Gabe

  • @MidnightTVYT · 2 months ago · +2

    That actually looks scary now

  • @realalchemy · 2 months ago

    Man, it looks wild now

  • @FilmSpook · 2 months ago

    VERY cool experiment! I was doing the same with clips from the 1998 classic Blade, but I stopped due to the lack of consistency: Runway tends to keep changing the ethnicity of characters when using video-to-video, sometimes even when I put in specific prompts. Gen-3 was rushed and still needs a lot of work, but it's fun to play around with. MiniMax, however, is the REAL game-changer here, not Runway, as it generally follows prompts extremely well and renders actions in real-time motion by default, whereas Runway and Kling are nearly 100% slow-motion; even their "fast motion" is mostly just a faster slow-motion. MiniMax also renders human actions FAR better than most of its competitors and is 100% free (for now). Runway needs to step their game up, and FAST.