Dataset Recaption & Mutation, plus 25 new LoRA Releases

  • Published: 21 Oct 2024

Comments • 3

  • @WhySoBroke 26 days ago +1

    Very interesting and creative approach. Nicely done amigo!!❤️🇲🇽❤️

  • @Vigilence 28 days ago +1

    I’m captioning artwork and noticed the captions come out around 1500 characters. Does Flux actually use that much text for captioning?

    • @FiveBelowFiveUK 18 days ago +1

      I tested this specifically with some lora versions. We trained the same dataset on chars vs token limit. Prompts can be longer than training captions it seemed. While 1500 is on the edge of the limit, it can work depending on how those words are tokenized. In the end although more details can be trained with longer captions, being brief prevents attention wandering, while increasing transformation elasticity.