The Battle of Polynomials | Towards Bayesian Regression

  • Published: 22 Aug 2024

Comments • 43

  • @bharadwajreddy7840
    @bharadwajreddy7840 3 years ago +13

    This is an amazing series, please continue making videos for the rest of the "Pattern Recognition and Machine Learning" book.

    • @KapilSachdeva
      @KapilSachdeva  3 years ago +3

      Will do.

    • @KapilSachdeva
      @KapilSachdeva  3 years ago +2

      Feeling guilty now :( …. Apologies for the delay.

    • @KaranYadav-hw9yo
      @KaranYadav-hw9yo 3 years ago +2

      Yeah, this is really helpful. I was struggling to make sense of these equations, but this video explained the concepts very systematically!

    • @KapilSachdeva
      @KapilSachdeva  3 years ago

      @@KaranYadav-hw9yo thanks 🙏… glad you found it helpful!

  • @adesiph.d.journal461
    @adesiph.d.journal461 3 years ago +8

    Please do more from PRML! A whole generation of us need it :) Thank you!

    • @KapilSachdeva
      @KapilSachdeva  3 years ago

      Do you mean beyond Bayesian Regression and Inference? … I am trying to base most of the tutorials in this series on examples and plots from PRML, provided it makes sense. For example, the latest topic (Markov Chains) is not covered to my satisfaction in PRML.

    • @adesiph.d.journal461
      @adesiph.d.journal461 3 years ago +2

      @@KapilSachdeva So here is how I see it: a lot of advanced ML courses abroad require prerequisites in the form of PRML or the MML book. And if someone wants to understand research papers, many of them assume students know chapters from PRML such as Kernel Methods and Approximate Inference (which become important in generative models and VAEs). Your videos are super nice, so if you work on a series along these lines, I believe undergraduates can start understanding the subject in more depth, start contributing, and try their hand at research. Of course, if you find other books make more sense for a particular topic, that is the way forward; it's just that the confidence students get from covering a prerequisite book with the reputation of PRML feels good.

    • @KapilSachdeva
      @KapilSachdeva  3 years ago +1

      All fair points. Thanks, your suggestion makes a lot of sense and I see the value in it. Let me ponder how to execute on this great suggestion! 🙏

    • @adesiph.d.journal461
      @adesiph.d.journal461 3 years ago +1

      @@KapilSachdeva Thank you! Really looking forward to it!

    • @KapilSachdeva
      @KapilSachdeva  3 years ago

      🙏

  • @Pruthvikajaykumar
    @Pruthvikajaykumar 2 years ago +4

    This channel is a gold mine, seriously. Thank you, Sir!

  • @johnjohnston3815
    @johnjohnston3815 5 days ago

    I just found your channel today and I am glad I did! Great stuff

  • @mathewjones8891
    @mathewjones8891 5 months ago +2

    This was a wonderful tutorial. Your graphics, style, clarity and humour are very helpful. Looking forward to the rest of the series. Specifically, I'm interested in using Bayesian prediction on my own data.

  • @gender121
    @gender121 1 year ago +3

    Really a treasure of knowledge expressed in a very simple way… a great help for learning the Bayesian treatment.
    Please complete the series; more videos are expected.

  • @hariharanvenkatraman3074
    @hariharanvenkatraman3074 10 months ago +1

    Such a gem of a playlist. Every topic is explained clearly. Please do more videos from the same book ❤❤❤

  • @zgbjnnw9306
    @zgbjnnw9306 2 years ago +1

    This video is so helpful for me after finishing my regression class. I didn't understand the overfitting concept before. Now I do! Thank you!

  • @sanjaythorat
    @sanjaythorat 1 year ago +1

    Very well explained. Please keep up the good work!

  • @Dev-zr8si
    @Dev-zr8si 2 years ago +1

    This is amazing, I hope you make more. Thank you.

    • @KapilSachdeva
      @KapilSachdeva  2 years ago +1

      🙏… there are 10 more videos in this playlist called “Towards Bayesian Regression”

  • @drewburns7805
    @drewburns7805 1 year ago +4

    Thank you. Very helpful! But I think there is a mistake in saying the weight vector is a row vector. I believe it should be a column vector for t = Xw (see the shape sketch after this thread).

    • @KapilSachdeva
      @KapilSachdeva  1 year ago

      🙏 thanks for pointing it out. You are correct. I will add a note in the description.
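
      To make the shape fix concrete, here is a minimal NumPy sketch (illustrative only; the degree, data, and variable names are assumptions, not taken from the video) showing the polynomial design matrix X paired with a column weight vector w so that t = Xw conforms:

      ```python
      import numpy as np

      # Illustrative sketch: degree-3 polynomial basis for N = 10 inputs.
      # Each row of the design matrix X is [1, x_n, x_n^2, x_n^3].
      M = 3
      x = np.linspace(0.0, 1.0, 10)
      X = np.vander(x, M + 1, increasing=True)   # design matrix, shape (10, 4)

      w = np.ones((M + 1, 1))                    # weights as a COLUMN vector, shape (4, 1)
      t = X @ w                                  # predictions, shape (10, 1)

      # A row vector of shape (1, 4) would not conform in X @ w,
      # which is why the column-vector convention is the correct one here.
      print(X.shape, w.shape, t.shape)           # prints: (10, 4) (4, 1) (10, 1)
      ```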

  • @sertacderya3775
    @sertacderya3775 1 year ago +2

    Very helpful videos, thank you very much!

  • @unimatorx
    @unimatorx 1 year ago +1

    That is super useful! Fantastic work :)

  • @cerioscha
    @cerioscha 1 year ago +1

    Great video, thanks !

  • @aarjavik123
    @aarjavik123 3 years ago +1

    Thanks for the excellent lecture.

  • @medihazukic5382
    @medihazukic5382 4 months ago

    Amazing lectures, thank you so much! I was wondering about 12:07: for the partial derivatives of E wrt w_i (in the right panel), shouldn't those be the partial derivatives of the ys and not the Es?

    • @KapilSachdeva
      @KapilSachdeva  4 months ago

      No, it is E, the error function. Our goal is to minimize the error (a short derivation is sketched after this thread).
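
      For reference, a short sketch of the quantity being differentiated, using the standard sum-of-squares error from PRML Chapter 1 (the exact notation on the slide at 12:07 is assumed, not copied):

      ```latex
      % Polynomial model: y(x, \mathbf{w}) = \sum_{j=0}^{M} w_j x^j
      % Sum-of-squares error over the N training points:
      E(\mathbf{w}) = \frac{1}{2} \sum_{n=1}^{N} \bigl( y(x_n, \mathbf{w}) - t_n \bigr)^2

      % It is E, not y, that gets differentiated, since the goal is to minimize the error.
      % Setting each partial derivative to zero yields the normal equations:
      \frac{\partial E}{\partial w_i}
        = \sum_{n=1}^{N} \bigl( y(x_n, \mathbf{w}) - t_n \bigr)\, x_n^{\,i} = 0,
        \qquad i = 0, \dots, M
      ```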

  • @ankurgupta8060
    @ankurgupta8060 2 years ago +1

    How do you create this type of video? Just curious; it's spot on with the black background, and the animation is enough to serve the purpose. Any lead is appreciated 👍. Thanks for the video and the content.

    • @KapilSachdeva
      @KapilSachdeva  2 years ago +1

      🙏 PowerPoint with a black theme, morph transitions, and animations. Not much more than that.

    • @KapilSachdeva
      @KapilSachdeva  1 year ago

      @@hyperadapted Sometimes in LaTeX, but mostly in PowerPoint :)