AI models collapse when trained on recursively generated data | Nature | Research paper review

  • Published: 10 Sep 2024
  • AI model collapse (Nature Paper): www.nature.com...
    Self-Consuming Generative Models Go MAD: arxiv.org/pdf/...

Comments • 6

  • @NoobCoder14 · 28 days ago

    Btw thank you sir for this video, I highly appreciate that people are creating videos based on research papers and studies 👍👍

  • @ChenchuReddy · 1 month ago · +2

    Thank you Vizuara for this quality content!!

  • @dibyajyotiacharya8916 · 1 month ago · +1

    Very interesting, please keep making explanation videos covering such research papers. I am looking forward to it.

  • @studyaccount7682 · 1 month ago · +1

    Thank you Siddhart, this was very informative.

  • @NoobCoder14 · 28 days ago

    The term "digital incest" 😅 seems right for this phenomenon.
    Btw, if this goes on it will be hard to train models in the future, and how can someone determine whether they have original data?

    • @vizuara · 27 days ago

      The only way is to invest as many resources as we put into training and evaluating LLMs into building LLM plagiarism tools.
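
The collapse dynamic the video and comments discuss can be sketched with a toy simulation. This is not the Nature paper's experimental setup, just a minimal illustrative assumption: fit a 1-D Gaussian to data, then train the next "generation" only on samples drawn from the fitted model. Over generations, the estimated spread shrinks and the distribution's tails disappear, which is the core of the model-collapse argument.

```python
# Toy sketch of model collapse (illustrative only, not the paper's experiments):
# each generation fits a Gaussian to its data, then the next generation
# is trained solely on samples drawn from that fitted model.
import numpy as np

def recursive_fit(n_samples=20, n_generations=200, seed=0):
    rng = np.random.default_rng(seed)
    # Generation 0: "real" data from a standard normal distribution.
    data = rng.normal(loc=0.0, scale=1.0, size=n_samples)
    stds = [data.std()]
    for _ in range(n_generations):
        mu, sigma = data.mean(), data.std()      # fit the model (MLE) to current data
        data = rng.normal(mu, sigma, n_samples)  # next generation sees only model output
        stds.append(data.std())
    return stds

stds = recursive_fit()
print(f"std at generation 0:   {stds[0]:.3f}")
print(f"std at generation 200: {stds[-1]:.3f}")  # far smaller: diversity has collapsed
```

With a small sample size at each step, estimation error compounds: the fitted variance drifts downward generation after generation, so later models see an ever-narrower slice of the original distribution.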