Knowledge Distillation in Large Language Models

  • Published: 11 Sep 2024

Comments • 4

  • @chigbundu
    @chigbundu 27 days ago +2

    Thank you. This really helped me grasp knowledge distillation; while I knew about it, I didn't understand how it was done. Your explanation, especially when you showed the maths behind it, was really helpful.

    • @John.Olafenwa
      @John.Olafenwa  26 days ago +1

      That’s great to hear. Thanks for the feedback!

  • @chigoziesamuel4782
    @chigoziesamuel4782 17 days ago +1

    This was really easy to follow. Thank you!
    If you could also make a video with an example implementation of LLM distillation, it would be really appreciated.

    • @John.Olafenwa
      @John.Olafenwa  16 days ago

      Sounds like a great idea. Added to my todo!
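
Until that video lands, here is a minimal PyTorch sketch of what such an implementation could look like, using the classic soft-target formulation from Hinton et al. (2015): the student is trained on a weighted sum of the KL divergence to the temperature-softened teacher distribution and the ordinary cross-entropy on the hard labels. The function name `distillation_loss` and the defaults `T=2.0` and `alpha=0.5` are illustrative assumptions, not code from the video.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Illustrative sketch of the standard knowledge-distillation objective.

    Soft targets: KL divergence between the temperature-softened teacher and
    student distributions, scaled by T**2 to keep gradient magnitudes stable.
    Hard targets: ordinary cross-entropy against the ground-truth labels.
    """
    # Temperature-softened distributions over the vocabulary.
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)

    # KL(teacher || student), averaged over the batch.
    kd_loss = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T ** 2)

    # Standard cross-entropy on the hard labels.
    ce_loss = F.cross_entropy(student_logits, labels)

    # Weighted combination of the soft- and hard-target losses.
    return alpha * kd_loss + (1 - alpha) * ce_loss
```

In an LLM setting, `student_logits` and `teacher_logits` would typically be flattened to shape (batch * sequence_length, vocab_size), with `labels` holding the corresponding next-token ids, so that both the KL term and the cross-entropy are computed per token position.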