Bayesian Neural Network | Deep Learning

  • Published: 22 Oct 2024

Comments • 65

  • @sibyjoseplathottam4828 · 2 years ago +18

    This is one of the best explanations I have seen on Bayesian neural networks. Thanks!

  • @M94-24 · 21 days ago

    Only two minutes in, and I can already say with confidence that this is the best explanation of B-CNN I've ever seen. Thanks a lot!

  • @waleedkhan8590 · 6 months ago +2

    One of the best and most concise descriptions of BNNs for newcomers such as myself

  • @arjunroyihrpa · 3 years ago +3

    You explained the Bayesian NN in the easiest possible way I can think of... excellent work

  • @my7username7is · 2 years ago +6

    Great explanation. Simple and to the point

  • @softerseltzer · 3 years ago +5

    I enjoy your videos, and I have a suggestion to improve the audio quality. You should build or buy a pop filter to put in front of the microphone to eliminate the puffing sounds. This way you can talk even closer to the microphone and the audio will improve.

    • @TwinEdProductions · 3 years ago

      Thanks for the suggestion - the puffing sound has been a real pain! We'll check out the pop filter.

  • @Must23 · 2 years ago +1

    The best explanation ever: shown side by side, so you can see how it differs from a standard NN

  • @anantsharma7955 · 3 years ago +2

    Hi, sorry I don't know much about what was talked about here in the video but I found it intriguing. I will be entering college soon and am deciding on my major. Is this statistics? Or Computer Science? Or mathematics? Felt like a combination of all of them (all 3 fields are very much overlapping too).

    • @TwinEdProductions · 3 years ago +4

      If I had to choose one name, I would call it machine learning (which is a subset of artificial intelligence): mathematics provides the fundamental tools, statistics offers the analytical frameworks, and computer science is a whole load of fun things :). But practically speaking, you will come across neural networks at college if you take a machine/deep learning course as part of a computer science / information engineering degree. However, you can find lots of excellent online resources to introduce you to the world of machine learning before you go to college.

  • @makamsidhura · 2 years ago

    Best explanation on Bayesian neural nets

  • @andreymanoshin2202 · 3 years ago +2

    Yep, I finally understood it. Thanks!

  • @Trubripes · 5 months ago

    How do you backprop in Bayesian neural networks?
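
One standard answer to the question above is "Bayes by Backprop": each weight is replaced by variational parameters (mu, rho), a weight is sampled via the reparameterisation trick w = mu + softplus(rho) * eps, and gradients flow through mu and rho exactly as in an ordinary network. A minimal single-weight sketch (all numbers invented for illustration; the KL regulariser is omitted for brevity):

```python
import math
import random

random.seed(0)

# Toy data: y = 2*x plus noise; one scalar weight to learn.
data = [(0.5 * i, 2.0 * (0.5 * i) + random.gauss(0, 0.1)) for i in range(1, 9)]

def softplus(z):
    return math.log1p(math.exp(z))

# Variational posterior over the weight: q(w) = N(mu, softplus(rho)^2).
mu, rho = 0.0, -3.0
lr = 0.05

for epoch in range(500):
    for x, y in data:
        eps = random.gauss(0, 1)
        w = mu + softplus(rho) * eps   # reparameterisation: w is deterministic in (mu, rho)
        dldw = 2 * (w * x - y) * x     # gradient of squared error w.r.t. the sampled w
        # Chain rule through the reparameterisation:
        mu -= lr * dldw                               # dw/dmu = 1
        rho -= lr * dldw * eps / (1 + math.exp(-rho)) # dw/drho = eps * sigmoid(rho)

print(mu)  # should end up close to the true weight 2.0
```

Because the sampled weight is a deterministic function of (mu, rho) and the noise eps, ordinary backpropagation applies; a full implementation would also backpropagate through the KL term between q(w) and the prior.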

  • @canalfishing4622 · 3 years ago +2

    This is really helpful. So an ensemble is a kind of simplified Bayesian approach?

    • @TwinEdProductions · 3 years ago +1

      Thanks! Yes you're correct - averaging an ensemble of models can be viewed as assigning equal importance to each model seed rather than weighting by the posterior distribution of weights.

    • @canalfishing4622 · 3 years ago +1

      @@TwinEdProductions there are quite a few Bayesian packages out there, like pymc3, gpytorch... it would be great if you could make a tutorial comparing their features and showing some hands-on applications, and showing the benefit of the Bayesian approach. I'm assuming the gpytorch model will require more computing power. Thanks again.

    • @TwinEdProductions · 3 years ago

      @Canal fishing Thank you for the suggestion - we will add this to our to-do list of videos! I'm personally quite a fan of BLiTZ as a Bayesian neural network library for PyTorch, while I believe gpytorch is a library for Gaussian Processes.

    • @canalfishing4622 · 3 years ago +1

      @@TwinEdProductions you are right, I will check it out, thank you.
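
The ensemble-as-Bayesian point in this thread can be made concrete: deep-ensemble averaging gives every member equal weight 1/M, while a Bayesian model average weights the same members by posterior probabilities. A toy sketch (the member weights and posterior probabilities below are invented for illustration):

```python
# Each ensemble member stands in for a model trained from a different seed;
# here they are just slightly different linear fits of the same function.
members = [lambda x, w=w: w * x for w in (1.9, 2.0, 2.1)]

def ensemble_predict(x):
    # Deep-ensemble view: every member gets equal weight 1/M, i.e. a uniform
    # "posterior" over the sampled models.
    preds = [m(x) for m in members]
    return sum(preds) / len(preds)

def bayesian_predict(x, posterior_weights=(0.2, 0.5, 0.3)):
    # Bayesian model average: the same members, now weighted by hypothetical
    # posterior probabilities p(model | data).
    return sum(p * m(x) for p, m in zip(posterior_weights, members))

print(ensemble_predict(x=3.0))  # ≈ 6.0
print(bayesian_predict(x=3.0))  # ≈ 6.03
```

The two predictions coincide exactly when the posterior over members is uniform, which is the sense in which an ensemble is a simplified Bayesian average.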

  • @mohdata100 · 3 months ago

    Thanks a lot. Can you share some of your published papers that I can go through?

  • @MLDawn · 2 years ago

    Lovely video. One small point: KL divergence is not a distance measure! (KL (a||b) != KL (b||a)). Cheers

    • @TwinEdProductions · 2 years ago +1

      Thanks! Agreed, KL is not strictly a distance measure, but it helps intuition to think of it as one :)
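
The asymmetry noted above is easy to verify numerically for two discrete distributions:

```python
import math

def kl(p, q):
    # KL(p || q) = sum_i p_i * log(p_i / q_i) for discrete distributions
    # with full support.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

a = [0.9, 0.1]
b = [0.5, 0.5]

print(kl(a, b))  # ≈ 0.368
print(kl(b, a))  # ≈ 0.511  -> KL(a||b) != KL(b||a), so KL is not a metric
```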

  • @rajarajankirubanandan3154 · 2 months ago

    Cool video

  • @RangoKe · 7 months ago

    Awesome video!! Nice and clear explanation! It would be perfect if the recording equipment was better👏

  • @user-wr4yl7tx3w · 2 years ago +1

    This is an excellent explanation.

  • @phoenix2718Utube · 9 months ago

    Wow this is way better than blog posts!!

  • @SamiaToor · 1 year ago +1

    Can we use them for regression problems?

    • @TwinEdProductions · 1 year ago

      Yes! In fact, that is a very common application of Bayesian neural networks.

    • @SamiaToor · 1 year ago

      @@TwinEdProductions but in my training the RMSE value is really high when I test the trained network using a BNN. I am averaging predictions over 500 samples to get the mean, but the error is still high.

    • @TwinEdProductions · 11 months ago

      @@SamiaToor does training work effectively when using a standard neural network for regression? And what is your motivation for using a BNN in your use case?
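
The "500 samples to get the mean" procedure discussed in this thread is the usual Monte-Carlo prediction for a BNN; a sketch where a Gaussian over a single weight stands in (hypothetically) for a trained network's posterior:

```python
import random
import statistics

random.seed(1)

def sample_bnn_prediction(x):
    # Stand-in for one stochastic forward pass of a trained BNN: here the
    # "posterior" over a single weight is just N(2.0, 0.1). In a real BNN,
    # each call would sample fresh weights and run the whole network.
    w = random.gauss(2.0, 0.1)
    return w * x

x = 3.0
samples = [sample_bnn_prediction(x) for _ in range(500)]
mean = sum(samples) / len(samples)  # the point prediction
std = statistics.stdev(samples)     # the predictive uncertainty
print(mean, std)                    # mean near 6.0, std near 0.3
```

A high RMSE on the mean prediction therefore points at the underlying fit (training, data, or model mismatch) rather than at the averaging step itself.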

  • @faatemehch96 · 3 years ago +2

    It's a really helpful video, thanks a lot!

  • @walterreuther1779 · 2 years ago +4

    3:55 Here my years of college crept in asking: Why is it equivalent? Doesn't this assume a particular loss function?
    I'm not quite sure - and perhaps the question is banal.
    But thank you very much, the video is incredibly helpful!
    Edit: Sorry, I didn't pay attention... you mentioned that only some loss functions adhere to this criterion ^^
    It is so satisfying to feel that those years of statistics finally pay off.
    Thank you very much!

    • @TwinEdProductions · 2 years ago +3

      Hi Walter, I must say you have asked the question which had bugged me for a couple of years as there seemed to be a mismatch between practical deep learning and the theoretical optimisation problem. And for some reason a lot of deep learning practitioners are not even aware of the link between the theoretical optimisation problem and the practical loss functions such as cross entropy loss. So, I have typed up a proof for both regression and classification to explain from first principles why we use cross-entropy loss. Here is the link to it: drive.google.com/file/d/1EJF1vlwD3QdyEAcrFc26JMxs4LgRYOnR/view?usp=sharing

    • @TwinEdProductions · 2 years ago +1

      Oh and also, the question is not banal at all!
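
The link in the replies above (the theoretical optimisation problem vs. the practical loss functions) is maximum likelihood: minimising the negative log-likelihood recovers cross-entropy for classification and, under a fixed-variance Gaussian likelihood, mean squared error for regression. Sketched in symbols:

```latex
% Maximum likelihood over i.i.d. data (x_i, y_i):
\hat{\theta} = \arg\max_{\theta} \prod_i p_{\theta}(y_i \mid x_i)
             = \arg\min_{\theta} \; -\sum_i \log p_{\theta}(y_i \mid x_i)

% Classification with one-hot targets y_{i,k} and softmax outputs \hat{p}_{i,k}:
-\log p_{\theta}(y_i \mid x_i) = -\sum_k y_{i,k} \log \hat{p}_{i,k}
\quad \text{(cross-entropy)}

% Regression with a Gaussian likelihood of fixed variance \sigma^2:
-\log p_{\theta}(y_i \mid x_i)
  = \frac{\bigl(y_i - f_{\theta}(x_i)\bigr)^2}{2\sigma^2} + \text{const}
\quad \text{(mean squared error up to scaling)}
```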

  • @excursion5246 · 2 years ago

    Can you share an example for a BNN regression problem? I need to build a BNN for a 6-input, 1-output problem. Thanks

    • @TwinEdProductions · 2 years ago +1

      Hi, if you are familiar with building typical neural networks in PyTorch, then the BLiTZ library is a very straightforward way to build a BNN for any equivalent network that you are interested in, such as regression. A tutorial introducing the library is here: towardsdatascience.com/blitz-a-bayesian-neural-network-library-for-pytorch-82f9998916c7
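
Library aside, the core of a Bayesian layer for the 6-input, 1-output case above is small enough to sketch in plain Python: store a mean and a standard deviation per weight and sample fresh weights on every forward pass. (This is a library-agnostic illustration with invented numbers, not BLiTZ's API, though its layers follow the same idea with trainable parameters.)

```python
import random

random.seed(0)

class BayesianLinear:
    """Toy Bayesian linear layer: every weight has a mean and a standard
    deviation, and each forward pass samples fresh weights."""

    def __init__(self, n_in, n_out):
        self.mu = [[random.gauss(0, 0.1) for _ in range(n_in)] for _ in range(n_out)]
        self.sigma = [[0.05] * n_in for _ in range(n_out)]

    def __call__(self, x):
        out = []
        for mu_row, sigma_row in zip(self.mu, self.sigma):
            # Sample a fresh weight vector for this forward pass.
            w = [random.gauss(m, s) for m, s in zip(mu_row, sigma_row)]
            out.append(sum(wi * xi for wi, xi in zip(w, x)))
        return out

layer = BayesianLinear(n_in=6, n_out=1)    # the 6-input, 1-output case
x = [1.0, 0.5, -0.2, 0.3, 0.0, 1.2]
preds = [layer(x)[0] for _ in range(100)]  # repeated passes give different outputs
print(min(preds) != max(preds))            # stochastic forward passes differ
```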

  • @ConcreteJungle95 · 3 years ago

    Thank you very much, great video

  • @liorkissos9303 · 1 year ago

    Very well explained!

  • @chetansharma4748 · 3 years ago +1

    Really helpful

  • @Globalian001 · 2 years ago

    Thank you for a great video! any references?

    • @TwinEdProductions · 2 years ago

      Thanks for the support! It's hard to pinpoint a single reference, but probably mostly the material I learnt during the 'advanced machine learning' course at my university.

  • @1UniverseGames · 2 years ago

    What is a mean aggregator and burnin in deep learning? Any chance to explain these?

    • @TwinEdProductions · 2 years ago

      Hi Jahid, would you be able to reference the context in which you came across the terms 'mean aggregator' and 'burnin', as they can have different interpretations in different applications? For example, for 'burnin', are you referring to 'burn-in' time in the development of statistical systems?

  • @jmdvinodjmd · 3 years ago +1

    wonderful!

  • @user-wr4yl7tx3w · 2 years ago +1

    A future video idea: a Python example of the above.

    • @TwinEdProductions · 2 years ago

      Thanks for the idea! Yes, a Python example from scratch would be nice.

  • @fujiang9920 · 3 days ago

    love from wuhan

  • @ThePeterDislikeShow · 3 years ago

    Wow soon we'll get machine-generated tickets in the mail for going out without a mask!

  • @hgmarques · 2 years ago

    Bayesian integration estimate

  • @dr.merlot1532 · 3 years ago

    Provide code or it didn't happen.

    • @TwinEdProductions · 3 years ago

      Hi! The video here simply explains the theoretical concept of Bayesian neural networks. If you're interested in the implementation side, check out great packages like BLiTZ (a Bayesian neural network library for PyTorch).