Kullback-Leibler (KL) Divergence Mathematics Explained

  • Published: 7 Sep 2024

Comments • 11

  • @datamlistic  7 months ago +3

    Check out why we don't use the mean squared error (MSE) loss in classification here: ruclips.net/video/bNwI3IUOKyg/видео.html
    *Important Note* - At 0:12, I've mistakenly written "Day 1" on both distributions. The correct labels are: "Day 1" on the left, "Day 2" on the right. (thanks @pouyasojoudi3585 for observing this!)
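
    A minimal numeric sketch of the KL divergence computation the video covers, assuming two made-up discrete distributions standing in for the "Day 1" and "Day 2" histograms (the actual values shown at 0:12 are not reproduced here):

    ```python
    import numpy as np

    # Hypothetical probabilities for the two days (illustrative values only,
    # not the ones from the video); each distribution must sum to 1.
    p_day1 = np.array([0.5, 0.3, 0.2])  # "Day 1" (left distribution)
    q_day2 = np.array([0.4, 0.4, 0.2])  # "Day 2" (right distribution)

    # D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i))
    kl_pq = np.sum(p_day1 * np.log(p_day1 / q_day2))
    kl_qp = np.sum(q_day2 * np.log(q_day2 / p_day1))

    print(f"D_KL(Day1 || Day2) = {kl_pq:.4f}")
    print(f"D_KL(Day2 || Day1) = {kl_qp:.4f}")  # KL divergence is not symmetric
    ```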

  • @RobertBakyayita  3 months ago +1

    Thanks, that was a very helpful and clear explanation 🎉❤

  • @veraferrer  20 days ago

    thank you

  • @cupq  7 months ago +1

    Thank you for this video!

    • @datamlistic  7 months ago

      Glad it was helpful! :)

  • @nicolo.lazzaro  6 months ago

    Great video as always

    • @datamlistic  6 months ago

      Thanks! Glad you liked it! :)

  • @pouyasojoudi3585  7 months ago

    you wrote both as the first day.. is that correct??

    • @datamlistic  7 months ago

      No, it's a mistake on my part. Sorry for any confusion this might have caused, and thanks for the feedback!