Markov Chains: n-step Transition Matrix | Part - 3

  • Published: 26 Nov 2024

Comments • 141

  • @Garrick645
    @Garrick645 3 months ago +8

    Bro dropped the coldest playlist on Markov Chains and thought we wouldn't notice 🥶. Sir 🙏🙏🙇‍♂🙇‍♂.
    These animated videos make you understand a 30-minute topic in 10 minutes.

  • @Teamsoakbeans
    @Teamsoakbeans 4 years ago +126

    This is a very underrated channel! Fantastic visuals and explanations. Thank you for all that you do!

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +4

      Thanks man! Do share the video if you can ❤️

    • @facundofelix5918
      @facundofelix5918 3 years ago

      So true

    • @sachinrajpandey5242
      @sachinrajpandey5242 3 years ago

      How can we know the value of r? If possible, make a separate video on the proof of the Chapman-Kolmogorov theorem.

    • @Rajlakshmank
      @Rajlakshmank 3 years ago

      I totally agree.

    • @djgyanzz
      @djgyanzz 2 years ago

      Absolutely true

  • @tasnimmeem1158
    @tasnimmeem1158 6 months ago +6

    You speak in such a clear manner, it's truly amazing. People forget how significant a part of teaching that is. Thank you.

  • @karannchew2534
    @karannchew2534 2 years ago +13

    Notes for my future review.
    05:06
    *Chapman-Kolmogorov Theorem*
    can be used to calculate the probability of going from state i to state j after n steps.
    05:06 = Probability of going from state i to state j in n steps with an intermediate stop in state k after r steps, summed over all possible values of k.
    Probability of going from state i to state j after n steps
    = Probability of going from state i into an in-between state * Probability of going from that in-between state to state j, for all the possible in-between states
    = ( Probability of going from state i to k after r steps * Probability of going from k to j after n-r steps ).sum_for_all_k_states
    In other words: get the probability of going from state i to state j in n steps through a state k. Do it for every possible state k. Then sum all the probabilities.
    But what is r? What is the value of r?
    r is the number of steps after which we look at the intermediate state k. r is fixed (the same for every k in the sum); only the intermediate state k varies over all the states.
    The exact value of r (any 0 < r < n) is not important, as long as the probability of going from k to j is taken over the remaining n-r steps.
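
    For reference, the relation the notes above describe, written compactly in the thread's notation (a restatement, not a quote from the video):
    P_ij(n) = sum over all states k of [ P_ik(r) * P_kj(n-r) ], for any fixed r with 0 < r < n.
    In matrix form this is just A^n = A^r * A^(n-r), where A is the one-step transition matrix.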

  • @barryward9729
    @barryward9729 1 year ago +4

    I've only just discovered this channel and watched the Markov Chains series. They're excellent. They explained the concept and practice clearly and relatively simply. I'll definitely be coming back to see what else is available here. It's a great resource.

  • @ericrogers6099
    @ericrogers6099 1 year ago +13

    I am so glad I stumbled upon this channel. Concepts are extremely clear and these videos have definitely helped me out so far. Thank you!

  • @lenaxdeng3202
    @lenaxdeng3202 1 year ago +4

    Thank you so much for the informative video. I am enrolled in a machine learning course with no background in calculus or linear algebra and this video is literally a life saver

  • @maroskusnir7393
    @maroskusnir7393 3 years ago +2

    Video 10/10
    (animations, sound quality, content, focused and on point)
    Thanks!

  • @VijayKumar-bt1bb
    @VijayKumar-bt1bb 3 years ago +7

    A single word is enough: "Awesome". Cleared all the doubts I had. Please make more videos in the AI/ML series.

  • @enronenron2411
    @enronenron2411 5 months ago

    I wasn't convinced of the quality of the vids, but after this, the channel deserves a subscribe.

  • @ahmadnadeem870
    @ahmadnadeem870 3 years ago +2

    Not in contact with books/studies for at least 13 years, but your power of illustration is amazing enough that I am understanding what is going on. 👍👍

  • @moumitasaha581
    @moumitasaha581 2 years ago +6

    Your presentation and teaching quality give a positive vibe, especially your voice. Moreover, it's great to see an Indian teaching in such a beautiful way. A highly recommended channel for anyone who wishes to learn advanced probability theory. There is a great smoothness in your teaching. Thank you, Normalized Nerd.

  • @sakethvns
    @sakethvns 4 years ago +32

    Please reduce the bell sound in the beginning. It is too loud compared to the voice, and at that moment we get an earful.

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +6

      Thanks for the feedback

    • @ahmadnadeem870
      @ahmadnadeem870 3 years ago +1

      And on the other hand, the voice during the explanation is too low.

  • @platoho1245
    @platoho1245 2 years ago +2

    It has been so intuitive with the visualization and the explanation of your teaching! Thanks and keep up the good work!

  • @soumodeepsen3448
    @soumodeepsen3448 1 year ago

    This is just the most SUPER COOL explanation of a MARKOV MODEL I have ever seen!!! Thanks so much, man!

  • @johnmandrake8829
    @johnmandrake8829 4 years ago +8

    ah yes! I've been waiting for this! Fantastic. Thank you so much!

  • @mayukhmalidas
    @mayukhmalidas 3 years ago +6

    These videos are gold !!!!!!

  • @daniel_960_
    @daniel_960_ 2 years ago +1

    If there was a video like yours for every topic I have to learn at university I wouldn't be so frustrated with learning.

  • @clintonpambayi893
    @clintonpambayi893 1 year ago

    You are the best in the game.

  • @audryk.7825
    @audryk.7825 3 years ago +3

    Thank you so much for the videos. You explained Markov Chain a lot better than my professor. You have mentioned aperiodicity in this video but you have not talked about it in later videos.

  • @chetangiradkar
    @chetangiradkar 2 years ago

    Channels like these made me think "Math is Wonderful"!

  • @jakestewart8784
    @jakestewart8784 1 year ago

    Outstanding series of explanations

  • @saeedeh_nikan
    @saeedeh_nikan 5 months ago

    intuitive example and clear explanation thank you

  • @ronabramovich56
    @ronabramovich56 9 months ago

    Amazing content :) For anyone who wondered, like me, why getting from 0 to 2 through 1 in 2 steps is the product of getting from 0 to 1 and getting from 1 to 2: it follows from the Chapman-Kolmogorov theorem explained around 5:54 :)
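
    Spelled out with the Chapman-Kolmogorov sum (assuming, as the comment implies, that the one-step probability P_02(1) is 0 in the chain, so the other terms of the sum vanish):
    P_02(2) = P_00(1)*P_02(1) + P_01(1)*P_12(1) + P_02(1)*P_22(1) = P_01(1) * P_12(1)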

  • @mohamadalthafshaikh7159
    @mohamadalthafshaikh7159 1 year ago

    This is the best HMM series on YouTube. Crystal-clear explanation, and the music in between is just superb... If you could send me the link to the music, that would be great. Thanks in advance.

  • @gak500
    @gak500 3 years ago +1

    this is so helpful. I'll be really excited when the next video comes out. Top quality.

  • @prateekdhingra292
    @prateekdhingra292 3 years ago +1

    Amazing video bro! Thanks, from Delhi.

  • @lucy4906
    @lucy4906 3 years ago

    This was explained very badly by my own teacher; you saved me, thank you! Please keep doing what you do.

  • @jasonleo6582
    @jasonleo6582 26 days ago

    love from China, good video

  • @anaibrahim4361
    @anaibrahim4361 3 years ago

    You deserve the likes bro
    Thanks

  • @elenasmirnova2623
    @elenasmirnova2623 2 years ago

    Great explanations! Thank you for making it so followable.

  • @sanjaykrish8719
    @sanjaykrish8719 4 years ago

    Such a simple and clear explanation of a complicated concept.. Fantastic.

  • @raphael_silva
    @raphael_silva 2 years ago

    Beautifully explained!

  • @niklas4999
    @niklas4999 2 years ago

    This is a work of art!

  • @rmb706
    @rmb706 2 years ago

    Thank you. I am doing a course in probability models and I have never seen Markov Chains. The textbook examples aren't the most intuitive so it has been a bit frustrating. This helps a lot.

  • @coopermaira
    @coopermaira 2 years ago

    Awesome series of videos, much appreciated

  • @chinmayrath8494
    @chinmayrath8494 1 year ago

    thanks a lot. you are a blessing. waiting for the aperiodicity video

  • @whiteheadiceprince1506
    @whiteheadiceprince1506 6 months ago +1

    People from the future say thanks, man

  • @cesarec88
    @cesarec88 2 years ago

    This video is incredibly clear! Thank you!

  • @kuljeetkaur
    @kuljeetkaur 2 years ago

    Very nicely explained. Thank you

  • @huyvuquang2041
    @huyvuquang2041 2 years ago

    Thanks chad, your videos really help me out. Keep it up bro...

  • @andresroca9736
    @andresroca9736 3 years ago +1

    Loved your narrative. I agree everything was Cool and Elegant 😎. Great video series on Markov. Gracias amigo (thanks, friend) 🤙.

  • @markovwallenstein9357
    @markovwallenstein9357 3 years ago

    looking forward to another video about aperiodicity!

  • @krishnateja9961
    @krishnateja9961 2 years ago

    awesome presentation.....
    Hope to see more content from you.....
    Good going !!
    You really normalise a nerd !!!

  • @aryakadam7892
    @aryakadam7892 8 months ago +1

    this was great

  • @shayanitami3288
    @shayanitami3288 2 years ago

    You earned my subscription. great videos till part 3

    • @shayanitami3288
      @shayanitami3288 2 years ago

      Let's see if you continue the great work in the next parts

  • @katarinaspackova6975
    @katarinaspackova6975 2 years ago

    Amazing video! Thank you!

  • @stydras3380
    @stydras3380 4 years ago +1

    Hello, future person here. I really like your videos regarding Markov chains :) I was wondering if it would be possible to also get the mathematical definitions that go along with the videos. Will you also be covering Markov decision processes and go into detail regarding the optimal value function/policy? It would also be really interesting to have a parallel series on continuous-state (and continuous-action, if we are talking about decision processes) Markov chains, as I don't know any good sources that cover them!

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +2

      Hmm, interesting... The Markov decision process and its use in RL is a broad topic. Currently, I don't possess a very good understanding of them. Maybe someday I will.

    • @stydras3380
      @stydras3380 4 years ago

      @@NormalizedNerd Same here, so I was hoping you did :P Anyhow, I'm eager for the next video!

  • @HusamuddineIsmail
    @HusamuddineIsmail 2 years ago

    Very explanatory and informative, concise and to the point.
    Please add more videos about the applications of Chapman-Kolmogorov in various applied fields.

  • @rezaerabbi2492
    @rezaerabbi2492 3 years ago +2

    What if we need to find 3-step transition probabilities like P11(3)? Then how should we proceed using the Chapman-Kolmogorov theorem?

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +3

      @Itachi Uchiha
      Assuming the number of states = 3.
      Suppose you have taken r = 1, so
      P11(3) = P11(1) * P11(2) + P12(1) * P21(2) + P13(1) * P31(2)
      Again applying the CK theorem to the 2-step transitions,
      P11(2) = P11(1) * P11(1) + P12(1) * P21(1) + P13(1) * P31(1)
      P21(2) = P21(1) * P11(1) + P22(1) * P21(1) + P23(1) * P31(1)
      P31(2) = P31(1) * P11(1) + P32(1) * P21(1) + P33(1) * P31(1)
      Similarly, you can use any other value of r.
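
      A minimal numerical check of the expansion above (the 3x3 matrix here is arbitrary and only illustrative, not the chain from the video; indices in the code are 0-based, so P11 becomes the [0, 0] entry):

      import numpy as np

      # Arbitrary 3-state transition matrix, rows sum to 1 (illustrative only).
      A = np.array([[0.1, 0.4, 0.5],
                    [0.6, 0.2, 0.2],
                    [0.3, 0.3, 0.4]])

      A2 = A @ A                         # 2-step transition probabilities
      A3 = np.linalg.matrix_power(A, 3)  # 3-step transition probabilities

      # Chapman-Kolmogorov with r = 1, mirroring the expansion above:
      p11_3 = A[0, 0] * A2[0, 0] + A[0, 1] * A2[1, 0] + A[0, 2] * A2[2, 0]

      print(p11_3, A3[0, 0])             # the two values agree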

  • @vege941
    @vege941 4 years ago +2

    Good Stuff bro!! Keep up the good work. Your explanation and the video edit is damn cool

  • @johnmandrake8829
    @johnmandrake8829 4 years ago +1

    Can you also do a video on QR and Cholesky decomposition, determinants, and eigenvalues/eigenvectors

  • @KaushikJasced
    @KaushikJasced 2 years ago

    It is a wonderful and useful lecture series. Can you share some reading material associated with this lecture series?

  • @prajwalpitlehra3722
    @prajwalpitlehra3722 2 years ago

    Great illustration man! Btw, what is the music that's played when we are multiplying A infinitely?

  • @ИринаДьячкова-и5ф
    @ИринаДьячкова-и5ф 3 years ago

    When you brought up the Chapman-Kolmogorov theorem proof and said "you can go through it", this felt slightly off-putting, because I couldn't. I mean, everything else is so clearly stated, and it looks like I should have understood the proof too, but I failed to. I don't know whether I'd feel better if the proof were explained, or if I knew what exactly the proof notation means, or if it were in the description.
    Anyway, your channel is great, you're so good at breaking complicated stuff down to simple stuff, I just wanted to give some feedback

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      Thanks for your honest feedback. Well, I didn't explain the derivation because that kinda bores people. Here's a tip to understand the derivation: take any two states i and j, n=4. Now try to compute P_ij(n) using r=0,1,2,3. You'll see how things work.

  •  1 year ago

    Please talk about the use of Markov Chains in Monte Carlo Method

  • @indrafirmansyah4299
    @indrafirmansyah4299 2 years ago

    Very nice visualization of the concept! What software did you use to make the visualizations and animations?

  • @MrPrincemohanty
    @MrPrincemohanty 4 years ago

    Fantastic Explanation...

  • @mohammadahsan7873
    @mohammadahsan7873 3 years ago +2

    Your presentation is fabulous.
    Make more such videos related to ML, AI, DS...
    Thanks a lot for this video. It helped me a lot in clearing all my doubts. ❤️

  • @Learndatasci
    @Learndatasci 4 years ago

    Great explanation! The yellow highlighting color you use on the Markov chain is a little tough to see, though.

  • @jerrynakoja4331
    @jerrynakoja4331 2 years ago

    Hello 👋
    Please kindly do a video on limiting distribution

  • @abhishekarora4007
    @abhishekarora4007 3 years ago

    Great video. Loved the Grant Sanderson-style tutorials :)

  • @raihanzaki5
    @raihanzaki5 9 months ago

    5:55 An alternative proof can be derived from matrix multiplication.

  • @ivanportnoy4941
    @ivanportnoy4941 3 years ago

    Nice job here!

  • @EZ-lt7xg
    @EZ-lt7xg 3 years ago

    Great videos! Could you please clarify whether we just use the pi values we found in the first video to find A^infinity? As I recall, in the first video we found that pi = [0.3591, 0.2145, 0.43564], which is not the same as 0.444, 0.333, 0.222 (the values in A^infinity). Thank you!

  • @RyanReed-uq9be
    @RyanReed-uq9be 1 year ago

    Just wanted to say there's a tiny error at ~4:30. It's stated that P_{ij}(n) = A_{ij}^n; that's not exactly correct in index notation, since that would mean raising each individual element to a power rather than the actual matrix. For n=2, P_{ij}(2) = A_{ik}A_{kj} (summing over k); for n=3, P_{ij}(3) = A_{ik}A_{kl}A_{lj}. It's a little more convoluted, but it works the way you want with the addition of parentheses: P_{ij}(n) = (A^n)_{ij}.
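
    A small numpy sketch of the distinction pointed out above (the 2x2 matrix is illustrative, not the one from the video):

    import numpy as np

    # Illustrative 2-state transition matrix (rows sum to 1).
    A = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    elementwise = A ** 3                        # raises each entry to the 3rd power; not a transition matrix
    matrix_pow = np.linalg.matrix_power(A, 3)   # the actual 3-step matrix: (A^3)_{ij} = P_ij(3)

    print(elementwise.sum(axis=1))  # rows no longer sum to 1
    print(matrix_pow.sum(axis=1))   # rows still sum to 1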

  • @luciafonsecadelabella3617
    @luciafonsecadelabella3617 3 years ago

    Thanks a ton!! You are the best!

  • @kacpers9463
    @kacpers9463 11 months ago

    Good job bro !

  • @OmerMan992
    @OmerMan992 3 years ago

    Great videos!
    Would you consider making video/s on Queueing theory for stochastic models please?

  • @sunaryaseo
    @sunaryaseo 3 years ago

    Nice channel; I have been waiting for this excellent explanation. Next, how about addressing the attention mechanism? Thank you so much for your excellent channel.

  • @vipingautam9501
    @vipingautam9501 2 years ago

    hey thanks for the video.

  • @joshustaran-anderegg9399
    @joshustaran-anderegg9399 3 years ago

    Thank you!

  • @iglesiaszorro297
    @iglesiaszorro297 3 years ago +1

    Super cool!!

  • @mazziskin
    @mazziskin 3 years ago +1

    ty for the content!

  • @gordonlim2322
    @gordonlim2322 3 years ago +1

    I think 3b1b will be proud that he has empowered you to make such great videos

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +1

      Thanks a lot, mate! I can't thank him enough for making and publishing manim.

  • @anamikap6806
    @anamikap6806 3 years ago +1

    Thank you 😍

  • @chintanmehta7354
    @chintanmehta7354 3 years ago

    Can you make videos on the introduction to stochastic processes?

  • @SAFACTS
    @SAFACTS 10 months ago

    hello person in the past !!! 😀😃😃

  • @sumers9396
    @sumers9396 2 years ago

    great video!! thanks a lot !

  • @usamamustafa7670
    @usamamustafa7670 2 years ago

    I think I saw Bayes' Theorem? Where did that bad boy pop out from?
    P.S Just started watching your videos... Awesome Work!

  • @haileydirks3559
    @haileydirks3559 3 years ago

    Hi, what is the difference between the Kolmogorov equation and n-step transitions?

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      What are you referring to as the 'n step transition'?

  • @derusonedwards9322
    @derusonedwards9322 2 years ago

    Great content😅😅😅

  • @suryanarayanan5158
    @suryanarayanan5158 3 years ago

    Amazing video. It's like watching 3blue1brown :)

  • @likeapple1929
    @likeapple1929 1 year ago

    Hi! Thanks for the video. I am wondering if the n-step transition matrix is helpful for building a 2nd-order Markov chain?

  • @丂卂丂乇匚卄丨卄卂
    @丂卂丂乇匚卄丨卄卂 10 months ago

    Thx

  • @MorriganSlayde
    @MorriganSlayde 2 years ago

    Because of your explanations on matrices, during my exam, instead of solving for all the pis of the transition matrix, I just made my calculator raise it to the 100th power and answered correctly 😂😂😂 (see the sketch after this thread)

    • @NormalizedNerd
      @NormalizedNerd  2 years ago +1

      Well…if they allow calculators then why not! 🤭

    • @MorriganSlayde
      @MorriganSlayde 2 years ago

      @@NormalizedNerd It was pretty cool to have a "trick" up my sleeve that wasn't taught in class, so thank you!
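
    A minimal numpy sketch of the calculator trick described in this thread (the matrix is illustrative, not the one from the video; for a regular chain every row of A^100 approximates the stationary distribution pi):

    import numpy as np

    # Illustrative regular transition matrix (rows sum to 1).
    A = np.array([[0.2, 0.6, 0.2],
                  [0.3, 0.0, 0.7],
                  [0.5, 0.0, 0.5]])

    A100 = np.linalg.matrix_power(A, 100)
    print(A100)         # every row is (approximately) the stationary distribution pi
    print(A100[0] @ A)  # multiplying by A again barely changes it: pi A = pi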

  • @EMMANUELTHESTUDENT
    @EMMANUELTHESTUDENT 4 months ago

    Hello and welcome to the people from the future 😊😊😊

  • @PranavSNair-hc9zk
    @PranavSNair-hc9zk 4 years ago

    yeeesssss
    I'm so here for the Indian 3b1b

  • @eceserin
    @eceserin 2 years ago

    Dude! who the fuck are you teaching me wayyyy better than my prof in uni? How dare you ^^

  • @katienefoasoro1132
    @katienefoasoro1132 2 years ago

    hi

  • @alex_zetsu
    @alex_zetsu 3 years ago

    The content is good, but you probably should put your mouth closer to the microphone because I turn up the volume to hear your voice but then the ad in the video blows up my headphones.

  • @WeedL0ver
    @WeedL0ver 4 years ago +1

    Ur accent sounds familiar
    Indian?

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +1

      Yeah :)

    • @WeedL0ver
      @WeedL0ver 4 years ago

      @@NormalizedNerd I'm from Bangladesh

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +1

      @@WeedL0ver Great. আমি বাঙালি ("I'm Bengali") :D

    • @WeedL0ver
      @WeedL0ver 4 years ago

      Lemme sub first xD

    • @WeedL0ver
      @WeedL0ver 4 years ago

      @@NormalizedNerd do u have discord? I found this video on reddit

  • @itsyourchanel2017
    @itsyourchanel2017 1 month ago

    Hello person in the past😊

  • @mediwise2474
    @mediwise2474 1 year ago

    Sir I need your email