MIT 6.S191: Evidential Deep Learning and Uncertainty

  • Published: 11 Sep 2024
  • Science

Comments • 59

  • @huiy.8767 · 3 years ago · +9

    Simply wonderful! The content is up to date and covers every aspect of Deep Learning I can think of. (I myself have been doing data mining teaching/research for the past decade, but my knowledge of deep learning needs constant updating.) I can't wait to learn from the yet-to-be-released lectures. Thank you!

  • @abdelrahmaneldesokey1762 · 2 years ago · +1

    I have worked with and read many resources on Bayesian Deep Learning, but your presentation is far beyond anything I have seen. Super clean, intuitive, and easy to follow.
    So thank you :)

  • @blvc_izzy · 3 years ago · +5

    Just getting to know this channel.
    I'm a complete beginner and I hope I can make use of this.

  • @nintishia · 3 years ago · +1

    Excellent coverage of the topic. However, a discussion of how easy or difficult it is to apply evidential deep learning to real problems, along with fast algorithms and a discussion of computational complexity, would be immensely useful.

  • @hamzamameche3893 · 3 years ago · +2

    This one is my personal favorite, given that I want to understand your latest ICRA paper. Thank you!

    • @AAmini · 3 years ago · +1

      Thanks!

  • @chanochbaranes6002 · 2 years ago · +3

    Amazing lecture, I can't wait for the 2022 videos.

  • @howardlo9040 · 3 years ago · +3

    Very high-quality presentation!

  • @parmoksha · 2 years ago · +1

    Such difficult content, explained in an easy-to-understand manner. To make a highly complex thing easy to understand, you really need a very in-depth understanding of the subject, and you also need to understand the students' mentality (thinking from the students' perspective about what they can understand and how to make the explanation as simple as possible). Both lecturers of this course are really great at both of these things.

  • @sameerjadhav5603 · 1 year ago · +1

    Thank you for this power-packed lecture. Literally every sentence was new knowledge for me!
    Where can I find the slides for this lecture? I couldn't find them at the link provided in the description. If you could provide a link to download the slides, that would be awesome. Thank you so much again!!

  • @Fordance100 · 3 years ago · +1

    Concise and clear, great lecture. Thanks for making it available to the general public.

  • @jing_li · 2 years ago

    There may be a typo in the slides: the paper by Sensoy et al. was published at NeurIPS 2018, not NeurIPS 2019.

  • @notsure7132 · 3 years ago · +4

    Thank you.

  • @gyeonghokim · 3 months ago

    A great, clear lecture!

  • @gio55964 · 1 year ago · +1

    Very informative. Thanks to Amini sir.

  • @zstosvm1434 · 1 year ago · +2

    Does anyone have any idea how to reproduce Fig. S3 from the paper? I don't understand this part: "Rather than using the L1 error in the regularization term, as in previous experiments, we use regularize the standard score and estimate epistemic and aleatoric uncertainty (Fig. S3)." I think in the previous experiment epistemic uncertainty was always higher than aleatoric uncertainty (see Fig. 3), but it somehow comes out sorted as shown in Fig. S3.

  • @KeksZero · 3 years ago · +2

    Is there an introduction or a book, and some GitHub Python code, about evidential deep learning that someone can share?
    Is there a difference between Bayesian evidential learning and evidential (deep) learning?

    • @AAmini · 3 years ago · +3

      This might be helpful: github.com/aamini/evidential-deep-learning
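      A minimal usage sketch, following the pattern in that repo's README (the DenseNormalGamma layer and EvidentialRegression loss names are taken from the README and may differ across versions):

          import tensorflow as tf
          import evidential_deep_learning as edl  # assumed install: pip install evidential-deep-learning

          # Regression model whose head outputs the four Normal-Inverse-Gamma
          # parameters (gamma, nu, alpha, beta) instead of a point estimate.
          model = tf.keras.Sequential([
              tf.keras.layers.Dense(64, activation="relu"),
              tf.keras.layers.Dense(64, activation="relu"),
              edl.layers.DenseNormalGamma(1),  # evidential output head
          ])

          # Evidential loss: negative log-likelihood plus an evidence regularizer.
          def loss_fn(y_true, y_pred):
              return edl.losses.EvidentialRegression(y_true, y_pred, coeff=1e-2)

          model.compile(optimizer="adam", loss=loss_fn)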

  • @guilherme_viveiros · 3 years ago · +2

    Great lecture, very clear.

  • @harshkumaragarwal8326 · 3 years ago · +1

    Can already see how this is going to positively affect medical imaging :))

    • @shiv9582 · 2 years ago

      If hospitals use this deep learning tech, evidential training will let doctors correctly predict a disease and its medicine 👈🏼🤌🏼👌🏼

  • @raminbakhtiyari5429 · 3 years ago · +6

    Hi, thanks for this great lecture.
    Can I get the implementation code for this algorithm?

    • @AAmini · 3 years ago · +8

      Yes! Here you go: github.com/aamini/evidential-deep-learning/
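      (The package should also be installable from PyPI under the repo's name, assuming the published name matches: pip install evidential-deep-learning. A short usage sketch is included in the reply to @KeksZero above.)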

  • @user-mg7el8mf3p · 3 years ago

    There is a mistake at 9:15. It should allow for bases different from e by writing e^(beta*z).

  • @ASHISHDHIMAN1610 · 1 year ago · +1

    I am working on a variation of deep evidential regression, and I love the intuition in this lecture!

  • @sachchitkholkute7180 · 1 year ago · +1

    Wonderful lecture! Thanks 👍

  • @lichunli3836 · 2 years ago · +1

    Great lecture!

  • @AhmedSALAH-bb7un · 8 months ago

    Exceptional!!

  • @khuongnguyenduy2156 · 3 years ago · +1

    Great lecture! Thank you very much!

  • @sagunshakya2579 · 2 years ago

    Loved every bit of the lecture! :)

  • @pipeescallon · 3 years ago · +1

    Just great!

  • @Aikman94 · 3 years ago · +3

    Is this possible for time series analysis?

    • @AAmini · 3 years ago · +1

      Definitely!! Evidential layers can be placed at the end of an LSTM (for example) to model the uncertainty at each timestep (for many-to-many problems) or at the final timestep (for many-to-one problems).

    • @Aikman94 · 3 years ago

      @AAmini So basically I transform my time series into a supervised learning problem (x, y) to run this algorithm?

    • @AAmini · 3 years ago · +1

      Exactly: for any supervised problem (e.g., trained with MSE loss for regression or cross-entropy for classification), it should be a simple drop-in replacement to use an evidential layer/loss instead (see the sketch after this thread). If you don't have a supervised learning problem, you can also obtain uncertainty estimates using sampling-based techniques, treating your model as a Bayesian NN (e.g., by using Monte Carlo Dropout sampling).

    • @Aikman94 · 3 years ago · +1

      @AAmini And this algorithm returns the mean and variance as well as the prediction for my time series, right?
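      A minimal sketch of that drop-in setup for a many-to-one time-series model, assuming the evidential-deep-learning package from the repo linked above (layer and loss names follow its README and are not guaranteed for every version):

          import tensorflow as tf
          import evidential_deep_learning as edl  # assumed: pip install evidential-deep-learning

          # Many-to-one: a window of past observations in, evidential parameters out.
          model = tf.keras.Sequential([
              tf.keras.layers.LSTM(32, input_shape=(24, 1)),  # 24 timesteps, 1 feature
              edl.layers.DenseNormalGamma(1),                 # replaces a plain Dense(1)
          ])

          def loss_fn(y_true, y_pred):
              # Evidential regression loss in place of plain MSE.
              return edl.losses.EvidentialRegression(y_true, y_pred, coeff=1e-2)

          model.compile(optimizer="adam", loss=loss_fn)

      The head outputs (gamma, nu, alpha, beta): the prediction is gamma, the aleatoric uncertainty estimate is beta / (alpha - 1), and the epistemic uncertainty estimate is beta / (nu * (alpha - 1)), which also addresses the question above about what the model returns.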

  • @mickaelslomka4629 · 3 years ago

    Hello, I don't quite get how you calculate the higher moments. Do you output E(X^n) on top of E(X) from a deterministic RNN? Thanks a lot for the really good video.

  • @nidajong3038 · 3 years ago

    It's an excellent lecture. Thank you so much.

  • @siddharthshrivastava5823 · 3 years ago

    Wow, awesome lecture!

  • @haniyek7811 · 2 years ago

    Thanks, great lecture!

  • @nileshramgolam2908 · 1 year ago

    Hi, what are the constraints on alpha, beta, lambda, and gamma?

  • @daikishimizu6800 · 3 years ago

    Is the f(x) in the epistemic variance f(x|w_t)? Also, does f(x|w) mean y?

  • @tommgn2664 · 2 years ago

    Hi, thanks for this really interesting content!
    I was wondering how uncertainty (aleatoric and epistemic) relates to the bias/variance decomposition:
    _Expected Test Error = Variance + Bias² + Error_
    I would say that:
    (i) *aleatoric uncertainty = Error* (i.e. the inherent noise in the training dataset, due to noisy data or wrong labels)
    (ii) *epistemic uncertainty = Variance* (i.e. "how far my prediction, obtained with one particular training dataset, is from the average model obtained in theory with all possible data")
    (iii) *Bias is independent of uncertainty*
    Not sure about (iii)... actually, not sure about any of it!
    Would be nice to have your opinion on that! =) Thanks a lot.
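    For reference, the standard decomposition being mapped onto here, written out under the usual assumption y = f(x) + epsilon with epsilon ~ N(0, sigma^2); the sigma^2 term is the irreducible noise that aleatoric uncertainty targets:

        \mathbb{E}\big[(y - \hat{f}(x))^2\big]
          = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{Bias}^2}
          + \underbrace{\mathbb{E}\big[(\hat{f}(x) - \mathbb{E}[\hat{f}(x)])^2\big]}_{\text{Variance}}
          + \underbrace{\sigma^2}_{\text{noise}}

    with expectations taken over training sets (and label noise).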

  • @creativeuser9086 · 1 year ago · +1

    No way there’s anyone who was able to understand the “evidential learning” part without looking into other videos or having prior knowledge.

  • @alexroberts3566 · 3 years ago · +1

    The more I think about this, the more I think this is not a good idea. You're asking a computer that has seen nothing other than cats and dogs to tell you how sure it is about what it is looking at. But there are many cases where something will look more like a cat or more like a dog, for example a city with two houses that look like cat ears. My point is that it may say it's very sure the image is more like a cat, but it can't tell us whether it is a cat, because it didn't learn what a cat is; it only learned how a cat differs from a dog.

    • @shiv9582 · 2 years ago

      Evidential training is there to overcome that problem, so that the system can take evidence from the outside world and learn from it.

  • @Kenspectacle · 2 years ago

    Hello, I have a question: what does ~ mean in a formula like y ~ Normal(mu, sigma^2)?
    Thank you for the nice lecture! :)

    • @shiv9582 · 2 years ago

      The ~ sign means "is distributed as": y ~ Normal(mu, sigma^2) says that y is a random variable drawn from a Normal distribution with mean mu and variance sigma^2. (Don't confuse it with ≈, which means "approximately equal".)
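      For concreteness, in standard notation:

          y \sim \mathcal{N}(\mu, \sigma^2)
          \quad\Longleftrightarrow\quad
          p(y) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left( -\frac{(y-\mu)^2}{2\sigma^2} \right)

      i.e., y is a random variable drawn from that distribution.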

  • @NeerajSharma-yf4ih · 3 years ago

    Great....

  • @hedada-d4v · 2 years ago

    Could you indicate the episode number in the title?

  • @AbhishekSinghSambyal · 2 years ago

    This lecture's slides are not available on the website.

    • @AAmini · 2 years ago

      Please use the archived link. All content from past years is still archived online. Thanks!

    • @AbhishekSinghSambyal · 2 years ago

      @AAmini Thanks!

  • @omkarsatapathy8209 · 3 years ago · +1

    Sir, I have watched the first 2 lectures. I am struggling to write the code and build the models. Kindly help me with the practicals 😌

    • @ShubhamSinghYoutube · 3 years ago

      The lectures on Recurrent Networks and CNNs?

    • @shiv9582 · 2 years ago

      Bro, can you get an IT job without coding???
      I don't like coding at all.

  • @justaqmal2379 · 3 years ago

    Is this lecture for a bachelor's degree, or a master's?