Lecture #2: Solved Problems of the Markov Chain using TRANSITION PROBABILITY MATRIX Part 1 of 3

  • Published: 11 Sep 2024
  • For Book: See the link amzn.to/2NirzXT
    This lecture explains how to solve problems of the Markov Chain using the TRANSITION PROBABILITY MATRIX. #OptimizationProbStat
    Other videos ‪@DrHarishGarg‬
    Lecture 1: Markov Chain & TPM: • Lecture #1: Stochastic...
    Lecture 2: Problems using TPM Part 1: • Lecture #2: Solved Pro...
    Lecture 3: Problems using TPM Part 2: • Lecture #3: Solved Pro...
    Lecture 4: Problems using TPM Part 3: • Lecture #4: Solved Exa...
    Lecture 5: Stationary Probabilities: • Lecture #5: Stationary...
    Lecture 6: Chapman-Kolmogorov Equation & Theorem: • Chapman-Kolmogorov Equ...

Comments • 117

  • @sidhantmishra3049
    @sidhantmishra3049 1 year ago +27

    I rarely comment on any YouTube video. But the clarity of explanation in this video, given the scarcity of this topic on YouTube, is immense.

  • @user-ur9km8cc9d
    @user-ur9km8cc9d 2 years ago +14

    You explain everything clearly and with ease! I'm always stressed when watching educational videos, but it's never the case with your videos! Many thanks.

  • @gamer7258
    @gamer7258 10 months ago +3

    Thank you very much from the bottom of my heart for this series on Markov chains. I am really glad people like you understand students' minute problems and come up with detailed, lucid explanations; even those who have no base can grasp the concepts seamlessly.

  • @anshallu3235
    @anshallu3235 3 years ago +7

    Clearly explained each concept and correctly divided the topics. From Kerala💥💥

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      Thanks for watching... Kindly explore other videos also

  • @attackN
    @attackN 1 year ago +1

    sending my regards from Turkey for your clear teachings, wishing you the best, doctor.

    • @DrHarishGarg
      @DrHarishGarg  1 year ago

      Thanks, and my pleasure... Keep sharing with other students too so that they can also learn in the easiest manner... Happy learning...

  • @monijesujolayemi2575
    @monijesujolayemi2575 9 months ago +1

    Watching from Nigeria, and you just helped me understand this Markov chain so well... thank you so much... I'm so happy... preparing for my final year exam in statistics.

  • @xuanyisong5676
    @xuanyisong5676 2 years ago +6

    You are my savior, professor! Hope you can make more videos on stochastic processes.

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +2

      Already 7 videos made... Check them all.

  • @cs_soldier5292
    @cs_soldier5292 3 years ago +4

    Thank you so much sir...
    I searched so many videos that were of no use. Finally I found this video which helped me clear all the concepts.
    Once again tq sir...

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Thanks. Kindly Watch other parts also.... You will definitely learn the full concept of Markov chain.

    • @chiragahlawat465
      @chiragahlawat465 2 years ago

      My Sir is GOD in Mathematics and I am very lucky to learn statistics from Sir :)

  • @prabhatmotwani8472
    @prabhatmotwani8472 3 years ago +19

    Every step is well understood thanks for this wonderful and informative video

  • @zelihaoz650
    @zelihaoz650 3 years ago +3

    I tried to understand this topic and I watched some videos. However, in this video he explained this very difficult topic easily. Thanks for your video :)

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      Many thanks... Keep watching the other videos also. You will get a lot of benefit.

  • @kavvyakishan2075
    @kavvyakishan2075 1 year ago +2

    I got a doubt. In the first example, you said q0 is the probability on the first day; then shouldn't q1 be the probability on the second day? How is q2 the probability on the second day?

  • @trustinyouu
    @trustinyouu 3 years ago +1

    Dr.Harish, wonderful session. Thank you -Dr.Kumara Raja Singh.G

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Thanks... Keep watching other videos too.

  • @gobezeassefa1437
    @gobezeassefa1437 3 years ago +4

    It really helps. Thank you!

  • @ShivamSingh-vo8tv
    @ShivamSingh-vo8tv 1 year ago

    Good morning sir! You did your duties greatly, and may Nature bless you to keep doing your duties in such a way consistently.

  • @buarikazeem4156
    @buarikazeem4156 2 years ago +1

    This video is really informative... searched for this explanation for months on YouTube.
    However, I would like to seek the audience's help on a particular Markov chain question.

  • @Owais486
    @Owais486 5 months ago

    Your explanation is really great, sir!

  • @Tina_jayden
    @Tina_jayden 2 years ago +1

    Wow, thank you so much professor, clearly explained. You are the best.

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      Many thanks for watching...keep watching next parts too for more understanding

  • @sssskhan3
    @sssskhan3 4 years ago +3

    Very well explained.

  • @amitsuyal5916
    @amitsuyal5916 4 years ago +3

    Very informative! Eagerly waiting for other videos.

    • @DrHarishGarg
      @DrHarishGarg  4 years ago

      Thanks. Next video is uploaded. Kindly watch.
      #OptimizationProbStat

  • @harshlohia26
    @harshlohia26 1 year ago

    Thank you so much for explaining so clearly. God bless you!

  • @salonichoudhary1327
    @salonichoudhary1327 4 years ago +3

    Sir, please make a video on classification of states of a Markov chain.

  • @surbhiyadav5491
    @surbhiyadav5491 1 year ago

    Sir, these lectures are very helpful for CSIR NET. Please make videos on PYQs on Markov chains from CSIR NET. Thank you🙏

  • @rajesh6c530
    @rajesh6c530 7 months ago

    Thank you very much for such a nice explanation

  • @shailysharma8904
    @shailysharma8904 1 year ago

    Really, thank you so much for this... this is just amazing...

  • @armarmarm2809
    @armarmarm2809 4 years ago +4

    can u do a tutorial of MARKOV CHAIN LOTTERY ..thank you Dr. Harish Garg

    • @DrHarishGarg
      @DrHarishGarg  4 years ago +1

      OK. I will consider it in my video on the Markov chain.

  • @HarishKumar-ns7ld
    @HarishKumar-ns7ld 3 years ago +1

    very helpful SIR, very easy to understand

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Most welcome.. Watch other lecture on Markov Chain (Lecture 3,4,5) for complete understanding

  • @shrawankumarpatel2148
    @shrawankumarpatel2148 1 year ago

    Thanks a lot, you make statistics easy for all students 🙏💗💗💗🙏

  • @jinotomjacob3069
    @jinotomjacob3069 3 years ago +1

    Thank you sir for this topic
    Can you please upload videos about the topic- Brownian motion

  • @krishnavenim2221
    @krishnavenim2221 3 years ago +1

    Wonderful explanation. Thank you sir.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      My pleasure.... You can also watch Lectures #3-#5 on the Markov process for the full topic.

  • @Stewie_official
    @Stewie_official 2 years ago

    Thanks Sir 🙏...You made it so easy

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      Thanks for watching ... Keep watching other videos too and share with others...

  • @ToshibShaikh
    @ToshibShaikh 1 year ago

    Best

  • @Herttt747
    @Herttt747 2 years ago

    Was very helpful thank you sooooo much

  • @Carlitos_SH
    @Carlitos_SH 2 years ago

    Massive thank you !

  • @Mehraj_IITKGP
    @Mehraj_IITKGP 7 months ago

    In example 1, if I solve the problem using the normal approach, I get a different answer. Here is my approach:
    we need to find the probability that the person chooses the train on the second day; this can be obtained as:
    P(T on day 2 | C on day 1) + P(T on day 2 | B on day 1) + P(T on day 2 | T on day 1)
    = (0.7) * (0.4) + (0.2) * (0.2) + (0.1) * (0.3) = 0.35
    Could anyone figure out where I am wrong?
    Any help is appreciated.

  • @abhishekshukla2974
    @abhishekshukla2974 7 months ago

    Very helpful. Thank you.

  • @riyabisht5935
    @riyabisht5935 3 months ago

    How did he calculate P²? Can someone tell me how he calculated the final outcome of the matrix?

  • @mehuldeshpande5925
    @mehuldeshpande5925 2 years ago

    Very well explained sir

  • @K.Krushna
    @K.Krushna 9 months ago

    Sir, I have export data. Will you please guide me on how to find the TPM and apply a Markov chain to it?
    Please help me.....

  • @MsAaradhyaChopra
    @MsAaradhyaChopra 4 years ago

    What is the role of the transition probability in moving from one state to another? If we want the transition from state i to state j only, is it required to be high or low?

  • @rishithareddy5546
    @rishithareddy5546 2 years ago

    Thank you for the clear explanation, Sir :-)

  • @ritanshitrivedi4278
    @ritanshitrivedi4278 3 years ago

    Very informative lecture

  • @bosslady4591
    @bosslady4591 9 months ago

    Can you give the solution with R studio ?

  • @cjs8037
    @cjs8037 3 years ago +1

    Sir kindly explain the applications on stochastic processes sir...
    Sir please explain where we can use this...in general...

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      There is wide applicability of stochastic processes, such as decision theory, Bayesian analysis, forecasting, optimization, etc.

    • @cjs8037
      @cjs8037 3 years ago +1

      @@DrHarishGarg thanking you sir

    • @cjs8037
      @cjs8037 3 years ago +1

      @@DrHarishGarg can you send me your email id sir

  • @AsifKhan-kk8wo
    @AsifKhan-kk8wo 1 year ago

    I'm stuck on the multiplication; how do I calculate the probabilities together?

  • @HuyNguyen-rv9ge
    @HuyNguyen-rv9ge 3 years ago

    Very useful video. Thank you Dr Harish. Do you have any Stata codes to work on Markov model?

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Sorry.... I don't have such Stata code.

    • @HuyNguyen-rv9ge
      @HuyNguyen-rv9ge 3 years ago

      @@DrHarishGarg thank you a lot anyway

  • @poojaprasad5653
    @poojaprasad5653 3 years ago +1

    Sir, please tell us if it can be asked to find P(X1=X2=X3=X4=2).

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Thanks... Yes, it can be asked. It is basically the Markov Chain as P(X1=2, X2=2, X3=2,X4=2). For this, see how to solve such problems in Lecture #4 on Markov chain, available in playlist

    • @poojaprasad5653
      @poojaprasad5653 2 years ago

      @@DrHarishGarg thank u very much sir

  • @aryansingh222
    @aryansingh222 3 years ago

    Sir, you are like a deity to me.. 🤗🙏🙏

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Thanks for watching... Kindly watch other parts too for complete understanding

  • @Adeel_ansari_
    @Adeel_ansari_ 2 years ago

    Amazing tutorial, sir. Please teach more topics!!!

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +1

      Thank you. All are already available in the playlist... otherwise, let me know the topic.

  • @amritaajain2170
    @amritaajain2170 3 years ago +1

    Sir, if the initial probabilities are not given, then how do we solve the same problem with another method?

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      If the initial probability is not given, then you can find it by using one of the distribution functions such as Binomial, Poisson, Geometric, etc. Watch Lecture 3 and Lecture 4 for the same.

    • @amritaajain2170
      @amritaajain2170 3 years ago

      Ok thank you so much

  • @suridhifernando4186
    @suridhifernando4186 1 year ago

    Can you explain Example 2 (ii) in the video (time: 11:32)?

  • @rashiberdia870
    @rashiberdia870 2 years ago

    I have a doubt: when P² is computed, how does the answer come to .24? I don't understand.

  • @sushmitachakraborty9345
    @sushmitachakraborty9345 2 years ago +1

    You are not practically showing how you are multiplying P·P.
    Do we need to add and then multiply?! We didn't get it.

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      P·P is the standard multiplication of two matrices... Use a calculator, or simply multiply row elements by column elements one by one, as is done in matrix multiplication.
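The row-by-column multiplication described in that reply can be sketched in a few lines of Python. The 2x2 TPM below is a hypothetical example, not the matrix from the lecture:

```python
# Multiply a transition probability matrix by itself (P·P = P^2):
# each entry of P^2 is a row of P times a column of P, summed.
# The matrix P here is hypothetical, for illustration only.
P = [[0.6, 0.4],
     [0.3, 0.7]]

n = len(P)
P2 = [[sum(P[i][k] * P[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]

print(P2)  # each row of P^2 still sums to 1
```

Note that every row of P² again sums to 1, which is a quick sanity check after any TPM multiplication.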

  • @cjs8037
    @cjs8037 2 years ago

    Namaste ji, can you give an explanation of the Chapman-Kolmogorov equation?

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      OK... I will explain soon

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      The Chapman-Kolmogorov video lecture is uploaded... You may watch it, please.

    • @anupamanandi7846
      @anupamanandi7846 2 years ago

      @@DrHarishGarg sir, how to calculate the state probability if the initial probability is not given?

  • @abdullahal-qudaimi1969
    @abdullahal-qudaimi1969 4 years ago

    great Sir.

  • @kapilshekhar484
    @kapilshekhar484 2 years ago

    Sir please suggest some book/s related to topic. 🙏

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +1

      For Book: See the link amzn.to/2NirzXT

    • @kapilshekhar484
      @kapilshekhar484 2 years ago

      @@DrHarishGarg thank you so much sir for your response.

  • @AnjaliMathOdyssey
    @AnjaliMathOdyssey 3 years ago

    Sir, I want to learn this topic for competitive exams (lecturer cadre), MCQ-type questions.

  • @rajput11778
    @rajput11778 4 years ago

    Sir, what are examples of state probability and transition probability? Please reply, and write the step-by-step process for n jobs through two and three machines.

    • @DrHarishGarg
      @DrHarishGarg  4 years ago

      Watch (Part 2 of 3) and (Part 3 of 3) for the examples of state probability and transition probability.

  • @nanigudimetla8002
    @nanigudimetla8002 3 years ago

    Your teaching is superb, but your voice is very low.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Thanks for watching and the feedback

  • @user-tu1kf6xd8d
    @user-tu1kf6xd8d 3 years ago

    I thought we just multiply the matrix in order to find the probability

  • @knowntounknowns5135
    @knowntounknowns5135 4 years ago +1

    Sir, how to calculate the probability P(X3=2, X2=3, X1=3, X0=2)? Please explain.

    • @DrHarishGarg
      @DrHarishGarg  4 years ago +1

      For it, you need the initial probabilities, i.e., p(0), and the TPM P. Once you have these, the required probability is
      p_0(2) x p_23 x p_33 x p_32.
      Here p_23 means the element of matrix P at the position 2nd row, 3rd column. Similar meaning for p_33 and p_32.
      While p_0(2) means the initial probability of state 2.
      I hope this clears it up for you.
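The chain rule in that reply can be sketched numerically. The TPM and initial distribution below are made up for illustration, with states numbered 1..3 as in the question:

```python
# Path probability P(X0=2, X1=3, X2=3, X3=2) = p0(2) * p_23 * p_33 * p_32.
# The TPM P and initial distribution p0 are hypothetical examples.
# States are labelled 1..3, so the Python index is (state - 1).
P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.1, 0.4, 0.5]]
p0 = [0.3, 0.4, 0.3]  # initial probabilities of states 1, 2, 3

def path_prob(p0, P, states):
    """Probability of observing the exact state sequence X0, X1, ..., Xn."""
    prob = p0[states[0] - 1]
    for a, b in zip(states, states[1:]):
        prob *= P[a - 1][b - 1]   # one-step transition a -> b
    return prob

print(path_prob(p0, P, [2, 3, 3, 2]))  # p0(2)=0.4, p_23=0.3, p_33=0.5, p_32=0.4
```

With these made-up numbers the product is 0.4 x 0.3 x 0.5 x 0.4 = 0.024.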

    • @aryansingh222
      @aryansingh222 3 years ago

      @@DrHarishGarg biggg efforts. ☺☺

    • @shaikmehnaaztabasum7460
      @shaikmehnaaztabasum7460 3 years ago

      @@DrHarishGarg Thank you so much sir very nice explanation........

    • @careshare3620
      @careshare3620 3 years ago

      @@DrHarishGarg If I have historical data of random events for three states A, B, and C, how can I establish the initial probabilities p(0) and the TPM P?

  • @Manu-jx8hs
    @Manu-jx8hs 2 years ago

    Sir, how do you write q_n = q_0 * P^(n)?

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      Kindly watch Lecture no. 1 for the TPM and n-step probability concepts.

  • @tanmay094
    @tanmay094 3 years ago

    Sir, how can I calculate E[X2]?

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      E(X^2) = the summation of x^2 p(x) for discrete variables, and
      the integration of x^2 f(x) dx for continuous variables.

    • @tanmay094
      @tanmay094 3 years ago

      @@DrHarishGarg Sir, can you please explain how I can calculate E[X2] in Example 0 of this lecture of yours?

    • @tanmay094
      @tanmay094 3 years ago

      This explanation of yours would be really helpful

    • @tanmay094
      @tanmay094 3 years ago

      This is actually expectation at time step 2
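As this thread points out, the expectation at time step 2 is taken over the step-2 distribution q_2 = q_0 * P^2 (it is not E[X^2]). A minimal sketch with hypothetical numbers, not the values from Example 0:

```python
# Expectation at time step 2: E[X_2] = sum over states x of x * q2(x),
# where q2 = q0 · P^2. All numbers here are hypothetical.
P = [[0.6, 0.4],
     [0.3, 0.7]]
q0 = [0.5, 0.5]
states = [1, 2]   # hypothetical numeric state labels

q = q0
for _ in range(2):  # two steps: q2 = q0 · P^2
    q = [sum(q[i] * P[i][j] for i in range(2)) for j in range(2)]

E_X2 = sum(x * px for x, px in zip(states, q))
print(E_X2)
```

The same pattern works for any time step n: advance the distribution n times, then weight each state label by its probability.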

  • @ashujadhav5457
    @ashujadhav5457 3 years ago +1

    Sir, please make a video on classification of states of a Markov chain with examples.