Stanford CS109 Probability for Computer Scientists I Adding Random Variables I 2022 I Lecture 17

  • Published: 2 Feb 2025

Comments • 15

  • @alexbui0609
    @alexbui0609 9 months ago +4

    Francis Galton's quote at the end and the students clapping. Wow, a wonderful lecture again! I am speeding through this like watching Netflix. Chris Piech is an awesome professor!

  • @rolandlochli4492
    @rolandlochli4492 8 months ago +3

    I watched every second from Lecture 1 to Lecture 17 in the past eight days. I cannot think of a better preparation for my Statistics final exam next month at my university in Germany. Many thanks to Chris and the TAs. What a wonderful course! Possibly the best around the globe.

    • @omaraymanbakr3664
      @omaraymanbakr3664 2 months ago

      I am curious, how did your exam go?

    • @rolandlochli4492
      @rolandlochli4492 2 months ago +1

      @@omaraymanbakr3664 Hey, it went pretty well (thank you for asking); it was quite easy to pass after doing two courses (one online and one in real life) simultaneously. The most difficult exam problem was on Maximum Likelihood Estimation.
      In retrospect, I would do as many exercises with pen and paper as possible on top of attending class and watching online content. Doing exercises with pen and paper is the surest and quickest way to an A.

    • @omaraymanbakr3664
      @omaraymanbakr3664 2 months ago

      @@rolandlochli4492 Well done.
      Out of curiosity again 😅, which course did you choose after this one?

    • @rolandlochli4492
      @rolandlochli4492 2 months ago +1

      @@omaraymanbakr3664 Statistics was a prerequisite to take the mandatory course in Machine Learning at my local university. I study Artificial Intelligence and almost all of my courses are mandatory courses. This term, I also have courses in Computer Vision and Optimization (think of Gradient Descent) among others. Do you study Computer Science or something related to Mathematics/Statistics?

    • @omaraymanbakr3664
      @omaraymanbakr3664 2 months ago

      @@rolandlochli4492 I have written this reply so many times, but it keeps being deleted, and I don't know why.
      Anyway, I am also studying machine learning (my end goal is computer vision). I am taking a machine learning course right now, and I realized that ML is just statistics with optimization and some advanced linear algebra! So, I was looking for a statistics course to take after this one, but I only found this one from MIT [18.650 | Fall 2016 | Undergraduate]. What do you think? Is it a good choice to take after this one? I am self-studying the field, so ...

  • @ananth2006
    @ananth2006 1 month ago +1

    Love each and every one of these lectures by Chris!

  • @MohammadrezaParsa-k7p
    @MohammadrezaParsa-k7p 1 year ago +3

    Every time I think it couldn't get any better, I encounter another amazing lecture by Chris 👌👍😎
    Hope to see more from him 🙏 (especially Programming Methodology!!)

    • @stanfordonline
      @stanfordonline  1 year ago +1

      Awesome feedback, thanks for your comment!

  • @arastooajorian9069
    @arastooajorian9069 1 month ago

    I think the convolution part uses confusing notation: f(X+Y = a) = ∫ f(X = a−y) f(Y = y) dy. There are clearly three different density functions here, one for the random variable A = X + Y, one for X, and one for Y, yet all three are written as f.

    • @adityabahuguna6815
      @adityabahuguna6815 1 month ago

      Because X + Y = a (a constant), whatever value X takes, Y must be a − X (or the other way around).
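
For readers following the notation discussion above, here is a minimal sketch in Python that writes the three densities with distinct subscripts and numerically checks the convolution formula f_{X+Y}(a) = ∫ f_X(a - y) f_Y(y) dy. The function names (f_X, f_Y, f_sum) and the Uniform(0, 1) example are chosen for illustration and are not taken from the lecture.

    import numpy as np

    def f_X(x):
        # Density of X ~ Uniform(0, 1) -- an illustrative choice, not from the lecture.
        return np.where((x >= 0) & (x <= 1), 1.0, 0.0)

    def f_Y(y):
        # Density of Y ~ Uniform(0, 1).
        return np.where((y >= 0) & (y <= 1), 1.0, 0.0)

    def f_sum(a, n=20_000):
        # Approximate f_{X+Y}(a) = integral of f_X(a - y) * f_Y(y) dy with a trapezoidal sum.
        y = np.linspace(-1.0, 3.0, n)
        return np.trapz(f_X(a - y) * f_Y(y), y)

    # The sum of two independent Uniform(0, 1) variables has the triangular density
    # f_{X+Y}(a) = a for 0 <= a <= 1, and 2 - a for 1 < a <= 2.
    for a in [0.25, 0.75, 1.0, 1.5, 1.9]:
        exact = a if a <= 1 else 2.0 - a
        print(f"a = {a:>4}: numeric {f_sum(a):.4f}  vs  exact {exact:.4f}")

The two printed columns should agree closely, which is one way to convince yourself that the formula (with properly subscripted densities) is the right one.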