Lecture #4: Solved Examples of Markov Chain using TPM (Part 3 of 3)

  • Published: 11 Sep 2024
  • For Book: See the link amzn.to/2NirzXT
    This lecture explains how to solve problems on Markov chains using the TRANSITION PROBABILITY MATRIX. #OptimizationProbStat
    Other videos: @DrHarishGarg
    Lecture 1: Markov Chain & TPM: • Lecture #1: Stochastic...
    Lecture 2: Problems using TPM Part 1: • Lecture #2: Solved Pro...
    Lecture 3: Problems using TPM Part 2: • Lecture #3: Solved Pro...
    Lecture 4: Problems using TPM Part 3: • Lecture #4: Solved Exa...
    Lecture 5: Stationary Probabilities: • Lecture #5: Stationary...
    Lecture 6: Chapman-Kolmogorov Equation & Theorem: • Chapman-Kolmogorov Equ...

Comments • 87

  • @chiragsahu115
    @chiragsahu115 2 years ago +19

    In our country, most professors with doctorate degrees don't know how to teach, and they are not interested. But you are passionate and interested in teaching students. Thank you, sir.

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +6

      Thank you, dear... ❤🙏 It's my pleasure to teach all of you.....

  • @jimmartin6803
    @jimmartin6803 2 years ago +20

    You are so awesome, Dr. Harish Garg. I was trying to learn from YouTube how to solve these kinds of problems, and your playlist is super precise and very well explained. It took me 2 hours to find your playlist on YouTube and, I am not kidding you, I learned how to solve these kinds of problems faster than it took me to find your channel, and about 10x faster than the time I spent reading the theory and not understanding a single thing because there were no examples.

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +6

      Many thanks, Jim, for watching.... Kindly share with others too so that they can also learn in a simpler manner.... Happy learning.

  • @cjs8037
    @cjs8037 3 years ago +4

    Sir, only now have I become clear on Markov chain problems. Thank you, sir 🙏🙏🙏

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +2

      Wow.... So nice and happy.... Happy learning....
      Kindly share the videos with others so that they can also learn easily.

    • @cjs8037
      @cjs8037 3 years ago +1

      @@DrHarishGarg sure sir

  • @hr.sharif0933
    @hr.sharif0933 3 years ago +3

    You are great, sir...!!! The explanations are so easy..
    I request you... Please, sir, make a video about classification of states... I hope you will solve this type of problem step by step ❤... Sir, please... please... please........

  • @amritaajain2170
    @amritaajain2170 3 years ago +6

    Really explained very clearly. Thank you so much 🙏🙏🙏

  • @shellypandey4316
    @shellypandey4316 3 years ago +2

    You are a lifesaver

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      It's my pleasure.... Watch the other lectures and share with others

  • @samuelmaina7130
    @samuelmaina7130 1 year ago +6

    I think there's a mistake in the last example: you used q0 instead of q1.

    • @tvgathu9845
      @tvgathu9845 11 months ago

      I thought so too; I think he should clarify.

    • @yogendravishwakarma5592
      @yogendravishwakarma5592 10 months ago +1

      No need to comment on any mistake or error; just learn the correct method, whatever sir explained. Once you understand, you will be able to correct it yourself.

  • @knowntounknowns5135
    @knowntounknowns5135 4 years ago +4

    Sir, I have a doubt.
    In finding P(X_3=3, X_1=1),
    we only have to find P_13^(2) · q_1(1), right? But in your video, you found P_13^(2) · q_0(1). Why, sir? I can't understand. Please explain.
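
    A minimal sketch, in Python/NumPy, of the computation this thread is asking about. The matrix P and initial vector q0 below are placeholder values (the actual numbers from the video are not reproduced in these comments); only the order of operations is the point.

        import numpy as np

        # Hypothetical 3-state chain with states 1, 2, 3 (rows/columns in that order).
        P = np.array([[0.2, 0.5, 0.3],
                      [0.4, 0.4, 0.2],
                      [0.1, 0.6, 0.3]])
        q0 = np.array([0.5, 0.3, 0.2])      # assumed initial distribution at time 0

        q1 = q0 @ P                          # distribution after one step
        P2 = np.linalg.matrix_power(P, 2)    # two-step transition probabilities

        # P(X_3 = 3, X_1 = 1) = P(X_1 = 1) * P(X_3 = 3 | X_1 = 1)
        #                     = q_1(1) * P_13^(2)   (q_1, not q_0, as noted in this thread)
        answer = q1[0] * P2[0, 2]            # index 0 = state 1, index 2 = state 3
        print(answer)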

  • @mustufaabidi7905
    @mustufaabidi7905 4 years ago +1

    As always, the explanation is in a very nice and simple manner.

    • @DrHarishGarg
      @DrHarishGarg  4 years ago

      So nice of you. Keep watching and sharing other videos also

  • @ravighimire8572
    @ravighimire8572 3 years ago

    Thank you, Dr. Harish, for the clear explanation of Markov chains (3/3 videos)

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      You're most welcome. Keep watching

  • @ritu1071
    @ritu1071 3 years ago +1

    Thank you very much for such a clear and crisp explanation. Really appreciate your efforts.👍

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      It's my pleasure.... Many thanks for watching.

  • @cjs8037
    @cjs8037 3 years ago +1

    Really all those examples are very easy to understand ji...

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Happy learning.... Keep sharing and watch other topics also in the easiest manner.

  • @AnjuKumari-pt7dc
    @AnjuKumari-pt7dc 5 months ago

    Nobody taught like you.
    Thank you, sir

  • @bhushanbarde5041
    @bhushanbarde5041 4 years ago +2

    Thank you so, so much! 🙏

  • @zoyakim206
    @zoyakim206 3 years ago +1

    Good explanation
    Helping me a lot for sem exam preparations 👏👏

  • @yogendravishwakarma5592
    @yogendravishwakarma5592 10 months ago

    Thank you, sir, for the amazing lecture series. I am a maths student, but I find statistics very interesting and easy because of you. Thanks a lot, sir.

    • @DrHarishGarg
      @DrHarishGarg  10 months ago

      It's my pleasure... Glad to hear that

  • @hhss1763
    @hhss1763 2 years ago +1

    Excellent as usual!!!

  • @chandruduk4309
    @chandruduk4309 3 years ago +1

    Excellent videos... very clearly explained, thanks a lot..

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      My pleasure.... Keep watching other videos too

  • @kusumkumari6894
    @kusumkumari6894 2 years ago +1

    Sir, can you please tell me whether you provide tuition for BSc statistics? I love the way you teach and I need your guidance in statistics, so I am asking. Please reply.

  • @sssskhan3
    @sssskhan3 4 years ago +1

    Very well explained with examples

  • @dominicfosu2485
    @dominicfosu2485 1 year ago

    I really love your teaching ❤🎉 I wish I had noticed your channel earlier 😊😊😊😊

  • @shruzzz
    @shruzzz 3 years ago

    Thanks a lot, sir! These Markov chain lectures helped me a lot for my sem exams 🙏.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Wow... That's so nice.... Thanks for watching.... Keep in touch always

  • @techhackz2897
    @techhackz2897 3 years ago +1

    Sir,
    please explain the last example in the comments.
    In the last example, can we use conditional probability from the 2nd lecture of this series?

  • @jaforislam3395
    @jaforislam3395 1 year ago

    Excellent explanation as always.

  • @xplorer5384
    @xplorer5384 1 year ago

    Thank you for all 4 videos
    Clearly explained 👍

    • @DrHarishGarg
      @DrHarishGarg  1 year ago

      So nice of you... Glad you liked them.... You may watch Lectures 5 and 6 too... Details below:
      Lecture 5: Stationary Probabilities: ruclips.net/video/zVCB49HYIxM/видео.html
      Lecture 6: Chapman-Kolmogorov Equation & Theorem: ruclips.net/video/TptjhklRnqM/видео.html

  • @surbhiswaraj6894
    @surbhiswaraj6894 4 months ago

    Thank you for the video, sir.
    It is very helpful 🙏

  • @dhirsalvi8611
    @dhirsalvi8611 4 months ago

    Amazing explanation

  • @adityaprasaddhal2462
    @adityaprasaddhal2462 3 years ago +4

    I'm getting confused by the last question 😕; it's somewhat similar to the 2nd method of solving a Markov chain

    • @techhackz2897
      @techhackz2897 3 years ago +1

      I have the same doubt; if we use the conditional probability we get only 0.34

  • @chiranjibejana1444
    @chiranjibejana1444 4 years ago +1

    Nice presentation professor

    • @DrHarishGarg
      @DrHarishGarg  4 years ago

      Many thanks. Keep watching and sharing

  • @AdityaSingh-ql9ke
    @AdityaSingh-ql9ke 2 years ago

    Thanks a lot sir, once again

  • @aa-gc3fg
    @aa-gc3fg 3 years ago +1

    Thank you so much, sir

  • @gadwalabdulla974
    @gadwalabdulla974 1 year ago

    Thanks a lot sir❤️🫡

  • @Ishorts-tc4sp
    @Ishorts-tc4sp 2 years ago

    Awesome sir

  • @somu42
    @somu42 2 years ago

    I had a question about finding the unit fixed probability vector of the matrix.
    How can I do that?
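
    A minimal sketch of one way to find the fixed (stationary) probability vector, i.e. the vector pi with pi P = pi and entries summing to 1 (the topic of Lecture 5 in this series). The matrix below is an assumed placeholder, not one from the video.

        import numpy as np

        # Hypothetical TPM, rows summing to 1.
        P = np.array([[0.2, 0.5, 0.3],
                      [0.4, 0.4, 0.2],
                      [0.1, 0.6, 0.3]])
        n = P.shape[0]

        # Solve pi (P - I) = 0 together with the normalisation sum(pi) = 1
        # as one (overdetermined) linear system via least squares.
        A = np.vstack([P.T - np.eye(n), np.ones(n)])
        b = np.zeros(n + 1)
        b[-1] = 1.0
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)

        print(pi)        # fixed probability vector
        print(pi @ P)    # equals pi (up to rounding), confirming pi P = pi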

  • @nilaislam5006
    @nilaislam5006 2 years ago

    Thanks

  • @candy_br09
    @candy_br09 9 months ago

    I think there is a small mistake in the last example; first we should consider (q0*P)(1) * P^2(13)

  • @maths457
    @maths457 1 year ago

    Sir, what will P(X1=3) be?

  • @ritanshitrivedi4278
    @ritanshitrivedi4278 3 years ago

    Informative content

  • @cjs8037
    @cjs8037 3 years ago +1

    Sir, please tell me which book I should purchase for basic stochastic processes

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      There are so many books on stochastic processes.... If you want to buy one, you can choose any of them. All the books are good.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      For Book: See the link amzn.to/2NirzXT

  • @sajidhasnat4457
    @sajidhasnat4457 3 years ago

    Thanks 👍😊

  • @aparnamishra2972
    @aparnamishra2972 1 year ago

    Sir, in the last example, should q1(1) be used for P(X3=1, X1=1)?????

  • @tvgathu9845
    @tvgathu9845 11 months ago

    In the last example, P(X3=3, X1=1), shouldn't we first calculate q1(1)?

    • @DrHarishGarg
      @DrHarishGarg  11 months ago

      Yes, you need to compute q1(1)...

    • @vinayaksinghal
      @vinayaksinghal 9 months ago

      @@DrHarishGarg Sir, please update the video description.

  • @user-rd1ok6zs3f
    @user-rd1ok6zs3f 9 months ago

    WOW

  • @shrawankumarpatel2148
    @shrawankumarpatel2148 1 year ago

    🌷🌷🌷

  • @shortWM
    @shortWM 1 year ago

    • @DrHarishGarg
      @DrHarishGarg  1 year ago

      Many thanks....
      Join my telegram link of Probability and Statistics
      t.me/gargProbability

  • @santhoshnaidukoribilli
    @santhoshnaidukoribilli 3 years ago

    Hi sir, can you give us the solution for the following transition matrix problem?
    Let Xn ∈ Z≥0 be the amount of water (say, measured in units of a hundred million gallons) in MacRitchie Reservoir at noon on day n. During the 24-hour period beginning at this time, Yn ∈ Z≥0 units of water flow into the reservoir. The maximum capacity of the reservoir is 5 units, and excessive inflows are spilled and lost. Just before noon on each day, exactly 1 unit of water is removed if the reservoir is non-empty, and no water is removed if the reservoir is dried up.
    Assume that Y1, Y2, ... are iid discrete RVs, where
    Pr(Y1 = 0) = 1/4, Pr(Y1 = 1) = 1/6, Pr(Y1 = 2) = 1/12, Pr(Y1 = 3) = 1/6, Pr(Y1 = 4) = 1/3.
    In fact, {Xn} (n = 1, 2, ...) forms a DTMC. Write down the transition probability matrix of the DTMC.
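
    Since the question is how to write down the TPM, here is a minimal sketch under one natural reading of the dynamics, X_{n+1} = max(min(X_n + Y_n, 5) - 1, 0), i.e. inflow capped at capacity 5, then 1 unit released if the reservoir is non-empty. Under that reading the noon levels live in {0, 1, 2, 3, 4}; the code only illustrates the construction and is not an official solution.

        from fractions import Fraction

        CAP = 5                               # reservoir capacity (units)
        N_STATES = 5                          # noon levels 0..4 (1 unit was just released)
        pY = {0: Fraction(1, 4), 1: Fraction(1, 6), 2: Fraction(1, 12),
              3: Fraction(1, 6), 4: Fraction(1, 3)}

        # Build the transition probability matrix row by row:
        # from level x, inflow y leads to level max(min(x + y, CAP) - 1, 0).
        P = [[Fraction(0)] * N_STATES for _ in range(N_STATES)]
        for x in range(N_STATES):
            for y, p in pY.items():
                x_next = max(min(x + y, CAP) - 1, 0)
                P[x][x_next] += p

        for row in P:
            print([str(v) for v in row])      # each row sums to 1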

  • @mohinishendye1040
    @mohinishendye1040 2 years ago

    👍👍

  • @akshatgoyal9113
    @akshatgoyal9113 2 years ago +1

    The last example is wrong. We need to find q1 of state 1 by using q1 = q0*P and then use the state-1 entry as our probability

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +3

      Thanks

    • @akshatgoyal9113
      @akshatgoyal9113 2 years ago +1

      @@DrHarishGarg By the way, thanks, sir, for the great videos; I learnt it from you only.

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +2

      My pleasure, dear, always.... Keep watching the other videos too and share with others. Thanks and cheers

  • @wylercambod2955
    @wylercambod2955 3 years ago

    Zero to jero

  • @celebrities01
    @celebrities01 2 years ago

    Thank you so much, sir ji🙏

  • @sunandapatra2774
    @sunandapatra2774 2 years ago

    Thank you so much sir.

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      My pleasure.... Keep watching and sharing