Detecting Anomalies Using Statistical Distances | SciPy 2018 | Charles Masson

  • Published: 31 Dec 2024

Comments • 17

  • @arshamafsardeir2692
    @arshamafsardeir2692 2 years ago +11

    The best explanation of statistical distances that I have found. Easy and nice explanation of the Kolmogorov-Smirnov test, the Wasserstein distance, and the KL divergence.

  • @mathman2170
    @mathman2170 2 years ago +1

    Love it when a talk presents the material in a carefully developed, logical manner. Thanks!

  • @kimmupfumira3417
    @kimmupfumira3417 3 years ago +3

    Great explanation! Easy to digest.

  • @MauricioSalazare
    @MauricioSalazare 6 years ago +9

    Well done! Nice explanation!

  • @minesinitiativesrussie1778
    @minesinitiativesrussie1778 5 years ago +1

    You're the best, little buddy! I didn't understand any of it, but it's classy all the same!

  • @jamesmckeown4743
    @jamesmckeown4743 5 years ago +6

    17:13 there should be a negative sign in the definition of the KL divergence

    • @Mayur7Garg
      @Mayur7Garg 3 years ago +1

      I think the sign should depend on whether you are minimizing or maximizing it. By definition, distances are always non-negative.
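The sign question above can be checked numerically. A minimal sketch using `scipy.stats.entropy`, which computes KL(p‖q) = Σ p·log(p/q) when given two distributions (the distributions here are made up for illustration); written this way, with no leading minus sign, the result is always non-negative:

```python
import numpy as np
from scipy.stats import entropy

# Two discrete distributions over the same support (illustrative values).
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])

# KL(p || q) = sum(p * log(p / q)); non-negative by Gibbs' inequality,
# and zero only when p == q. Note it is not symmetric in p and q.
kl_pq = entropy(p, q)
kl_qp = entropy(q, p)

print(kl_pq, kl_qp)
print(np.isclose(kl_pq, np.sum(p * np.log(p / q))))  # matches the formula
```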

  • @nmertsch8725
    @nmertsch8725 5 years ago +2

    This is a great presentation! Is there a reason why you did not contribute the n-th Wasserstein distance to SciPy?

    • @canmetan670
      @canmetan670 5 years ago +1

      docs.scipy.org/doc/scipy/reference/generated/scipy.stats.wasserstein_distance.html
      As of this date, the latest stable version of SciPy on pip is 1.3.1. This has reportedly been available since 1.0.0.
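For reference, a minimal sketch of the function linked above, `scipy.stats.wasserstein_distance`, on two invented samples; for two equal-variance normals, the first Wasserstein distance equals the shift between their means:

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Empirical samples from two normals whose means differ by 0.5.
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=2000)
b = rng.normal(0.5, 1.0, size=2000)

# First Wasserstein (earth mover's) distance between the empirical
# distributions; the estimate should land near the true shift of 0.5.
d = wasserstein_distance(a, b)
print(d)
```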

  • @TheBjjninja
    @TheBjjninja 5 years ago +9

    6:15 we should either reject or fail to reject H0, I believe, instead of "accept H0"

    • @harry8175ritchie
      @harry8175ritchie 5 years ago

      AKA accept. I think it depends on where you learned statistics. My professors always said accept and reject.

    • @mikhaeldito
      @mikhaeldito 4 years ago +6

      Semantically, "accepting H0" and "failing to reject H0" may sound the same, but they are not!
      The p-value measures the probability of observing our data assuming that the null hypothesis (such as no difference between two groups) is true. So it is a measure against the null, not in favour of the null. That is why there are dedicated statistical tests of no difference, or similarity, called equivalence tests.

    • @joelwillis2043
      @joelwillis2043 3 years ago

      @@harry8175ritchie AKA NO. You can't conclude your assumption based on your assumption. This is like logic 101. HARD FAIL GO DRIVE A TRUCK FOR A LIVING.

    • @harry8175ritchie
      @harry8175ritchie 3 years ago +2

      Not the way to handle it buddy.

  • @nomcognom2332
    @nomcognom2332 6 years ago +2

    Good!

  • @112ffhgffg12
    @112ffhgffg12 2 years ago

    Thanks

  • @jesuse4691
    @jesuse4691 4 years ago

    G