Kullback-Leibler Divergence (KL Divergence) Part-1

  • Published: 21 Dec 2024

Comments • 9

  • @LearnMeComputer • 6 years ago

    Thanks, Doctor, for this amazing explanation, but I want to know: does a DBN use this algorithm to calculate the similarity between the actual and predicted values in order to stop learning and move on to the second RBM?
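    A minimal, hypothetical sketch (not the video's method, and not a claim about how DBNs are actually trained) of the kind of stopping check the question describes: compare actual values with a model's reconstructions via KL divergence and halt once the divergence falls below a threshold. The binning, smoothing constant, and threshold are illustrative assumptions.

    import numpy as np

    def kl_divergence(p, q, eps=1e-12):
        # KL(P || Q) for discrete probability vectors; eps guards against log(0).
        p = np.clip(np.asarray(p, dtype=float), eps, None)
        q = np.clip(np.asarray(q, dtype=float), eps, None)
        return float(np.sum(p * np.log(p / q)))

    def should_stop(actual, predicted, bins, threshold=0.01):
        # Histogram both value sets over shared bins, normalize the counts
        # to probabilities, and signal a stop once KL(actual || predicted)
        # drops below the threshold.
        p_counts, _ = np.histogram(actual, bins=bins)
        q_counts, _ = np.histogram(predicted, bins=bins)
        p = p_counts / p_counts.sum()
        q = q_counts / q_counts.sum()
        return kl_divergence(p, q) < threshold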

  • @vrfriends926 • 6 years ago

    Hi Niraj. This video is really helpful in understanding KL divergence.
    However, I have a question: how does KL divergence work when we are comparing a distribution over 10,000 events with a distribution over 1,000 events? Is it fair to do so?
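    Since KL divergence compares probability distributions rather than raw counts, a common approach (a sketch under assumed bin edges and smoothing, not from the video) is to histogram both samples over the same bins and normalize the counts to probabilities. This makes samples of 10,000 and 1,000 events directly comparable, though the smaller sample gives a noisier, higher-variance estimate.

    import numpy as np
    from scipy.stats import entropy  # entropy(p, q) computes KL(P || Q)

    rng = np.random.default_rng(0)
    sample_a = rng.normal(0.0, 1.0, size=10_000)  # 10,000 events
    sample_b = rng.normal(0.1, 1.2, size=1_000)   # 1,000 events

    # Shared bin edges put both distributions on the same support.
    bins = np.linspace(-5.0, 5.0, 51)
    counts_a, _ = np.histogram(sample_a, bins=bins)
    counts_b, _ = np.histogram(sample_b, bins=bins)

    # Normalizing counts to probabilities removes the sample-size mismatch;
    # a small constant keeps empty bins from producing infinite log terms.
    p = (counts_a + 1e-9) / (counts_a + 1e-9).sum()
    q = (counts_b + 1e-9) / (counts_b + 1e-9).sum()

    print(entropy(p, q))  # KL(P || Q) in nats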

  • @sureshdasarwar7524 • 7 years ago

    What's the role of epsilon and gamma here? Can you please explain?

    • @DrNirajRKumar • 7 years ago

      Thanks for your question. As I remember, I didn't use "epsilon and gamma" to describe anything. If you give the time where they are used, it will be helpful for me to explain the scenario and answer the question.

    • @sureshdasarwar7524 • 7 years ago

      Sure, sir! You can take your time and explain.

    • @DrNirajRKumar • 7 years ago

      I mean to say that I want to know the time interval in the 20-minute video where I used epsilon and gamma.

    • @sureshdasarwar7524 • 7 years ago

      Actually, I wanted to know the use of gamma and epsilon.

    • @DrNirajRKumar • 7 years ago

      I think your question is related to some other context (these terms are not used in the current video), so please give the context details.

  • @surajborade6741 • 6 years ago +1

    Work on your handwriting...