Markov Chains - Part 7 - Absorbing Markov Chains and Absorbing States

  • Published: 11 Dec 2024

Comments • 55

  • @alex4soccer4
    @alex4soccer4 10 years ago +2

    Your explanations are clear and visual, which I love. Forget my boring lessons; all I need are your videos. Great job, keep it up! - an 18-year-old student.

  • @nicolomanca9479
    @nicolomanca9479 10 years ago +4

    Hi Patrick, I'm an Italian guy studying statistics and finance in my fourth year of university. I want to thank you for all your videos on Markov chains; they're very clear, complete, and simple enough for any level of knowledge. Here in Italy we have the "problem" that topics are addressed in too theoretical a manner (though this way has its pros).
    Greetings,
    Nicolò.

  • @vishnu.f1
    @vishnu.f1 4 years ago

    Watching this one day before my finals. I haven't understood anything in my subject except for discrete Markov chains, thanks to @patrickJMT xD. Amazing video! Thank you so much!

  • @raphaelchu5716
    @raphaelchu5716 7 years ago +4

    I got a C and a C- in my year 1 maths courses, but in my second year I watched your videos and I got a B eventually! Not an excellent grade, but there was some improvement. Thanks so much! @patrickJMT

  • @transitor1
    @transitor1 11 years ago

    thank you so much, it helped me a lot, and the best thing you did is keep the videos short, which helps...

  • @tackiestproduct
    @tackiestproduct 11 years ago +1

    I hope that you get paid or something for all the time and effort you put into your videos. If not, I just hope you know that you're helping a lot of people out there, and we really do appreciate your help.

  • @md.muidulalamtuhin1863
    @md.muidulalamtuhin1863 5 years ago

    This is a seriously fantastic tutorial.
    I can't thank you enough, I guess.
    I have to say I am grateful.

  • @dsandeep6840
    @dsandeep6840 4 years ago

    Thanks a lot....bro...still a lot of help after 8 years ❤️❤️❤️❤️...🙏

  • @patrickjmt
    @patrickjmt  12 years ago

    but i gave an example of a transition matrix with an absorbing state that does not have a limiting matrix so clearly something more is needed.

  • @sharifk9860
    @sharifk9860 6 years ago

    I love your videos. Amazing work Patrick.

  • @xbwheea17
    @xbwheea17 9 years ago +3

    hi there! this is really useful - it's clear, and made simple. Easy to comprehend. Thank you! :)

  • @agyarenk1
    @agyarenk1 10 years ago +29

    No need to go to class when I have you. Is there a fund we can donate to, so that you can keep on providing good tutorials in different subject areas as well?

    • @patrickjmt
      @patrickjmt  10 years ago +27

      if you visit my website and click on any video a paypal link pops up that no one ever donates to

    • @patrickjmt
      @patrickjmt  8 years ago +9

      or you can now visit my page on 'patreon', i'd love ya for it

    • @omardinhoR9
      @omardinhoR9 5 years ago +2

      did you ever donate the money or did u just wanna say it?

    • @rui1109
      @rui1109 5 years ago +3

      @@omardinhoR9 why do u care

  • @bfunkt4313
    @bfunkt4313 7 years ago

    What an awesome video. Thank you! Precisely the information I need for my use case.

  • @waynehughes6967
    @waynehughes6967 9 years ago

    Great video Patrick, thank you.

  • @rafandralazuardi9463
    @rafandralazuardi9463 9 years ago +3

    I got first rank on the mid-test because of this video. Really.

  • @Qhartb
    @Qhartb 12 years ago

    "For transition matrices for Markov chains with one or more absorbing states to have limiting matrices, we NEED one additional condition:"
    I think that condition is sufficient but not necessary. Consider a Markov model consisting of two disconnected "chains," one of which is absorbing and one of which is regular. Each chain has a steady state, and therefore so does the system as a whole, but the regular component doesn't have a path to the absorbing state.
    ex:
    1 0 0
    0 .9 .1
    0 .7 .3
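
    A minimal Python sketch of this check (assuming numpy is available): raise the example matrix to a large power and see whether the entries settle down, i.e. whether a limiting matrix exists.

    import numpy as np

    # Transition matrix with an absorbing state (state 1) and a separate
    # regular component (states 2 and 3) that never reaches it.
    P = np.array([[1.0, 0.0, 0.0],
                  [0.0, 0.9, 0.1],
                  [0.0, 0.7, 0.3]])

    # If the entries of P^n stabilize for large n, a limiting matrix exists
    # even though the chain does not satisfy the extra condition.
    print(np.round(np.linalg.matrix_power(P, 100), 3))
    # expected output, approximately:
    # [[1.    0.    0.   ]
    #  [0.    0.875 0.125]
    #  [0.    0.875 0.125]]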

  • @MielPopsization
    @MielPopsization 9 years ago +1

    Thank you very much for all your videos on Markov chains, you made it very understandable :)

  • @israabenabdallah9203
    @israabenabdallah9203 5 years ago

    thank you, you are saving lives :')

  • @ClearerThanMud
    @ClearerThanMud 5 years ago

    I've really appreciated your Markov Chain videos -- thanks. I'll just offer that sometimes, like in this discussion of absorbing states, I think the explanation could be made much clearer by showing the state-transition diagram for the matrix.

  • @IHeartViHart
    @IHeartViHart 12 years ago

    Very clearly explained. :]

  • @rastabastard1321
    @rastabastard1321 11 years ago

    great video! just one suggestion: at the end of each video you could add a short summary of the most important points, so someone can review them in order in a short time and learn it better before going to the next video. keep up the good work!

  • @Mahmzo
    @Mahmzo 6 years ago

    Hey, do you have any videos on first-step analysis / expected number of visits for Markov chains?

  • @retagainez
    @retagainez 3 years ago +1

    I'm working on a google coding challenge right now, and I have to implement an algorithm that uses ideas from this concept. It's a great help, to say the least.

  • @ComplexInformationMrEngineer
    @ComplexInformationMrEngineer 7 years ago

    It is even more descriptive than the mainland :D

  • @payamm3761
    @payamm3761 12 years ago

    Hi Patrick. Thank you very much for your videos. I have some questions regarding the expected number of steps to go from state i to j, and about expected times in continuous Markov chains, e.g. the expected time to go from state i to an absorbing state j. Thank you very much.

  • @bionicsheep7131
    @bionicsheep7131 9 years ago +1

    great videos, I know they are old, but if you haven't gotten one yet, a mic for louder and better audio quality would be great

  • @jagmarz
    @jagmarz 12 years ago

    Could you maybe explain the second criterion in the Absorbing Markov Chains a bit further? In particular, it's odd that it includes the restriction about "finite number of steps". Is this some qualification that is here because of some as yet undiscussed application of Markov Chains? I just can't see what it would mean for the transition to be possible only if there WERE an infinite number of steps (otherwise why exclude it?). Thanks!

  • @hmn4124
    @hmn4124 8 years ago

    thanks man!

  • @dutta.alankar
    @dutta.alankar 8 years ago

    Hi Patrick,
    I have a question. As far as I could make out from the videos in this MC series, all absorbing Markov chains are regular, but the converse isn't true. Am I right? And can you please tell me clearly what the differences are between absorbing and regular MCs?

  • @aleezamohammad1334
    @aleezamohammad1334 9 years ago

    What about if there is a two-way street? Would it still be possible for non-absorbing states to go to an absorbing state? For instance, we have .1 going to state B from state C and 0 going to state C from state B. Suppose instead we have some number, let's say .4, going to state C from state B. Now this is a two-way street. Would it still be possible for these non-absorbing states to go to state A, which is an absorbing state? My other question is: let's suppose we have two absorbing states, A and B, and a non-absorbing state C, and people are going from state A to state C and from B to C, but there is no one going from state C to either of those absorbing states. Does that mean it's not possible for people/anything to go from state C to those absorbing states? Also, there are some people who stay in state C. Ex.
    A  B  C
    1  0  .2
    0  1  .3
    0  0  .5
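
    A minimal Python sketch for this example (assuming numpy, and assuming the matrix is read with columns as the "from" states, which is the only reading under which these columns sum to 1; the video's convention may differ). Raising the matrix to a large power shows where the mass starting in C ends up:

    import numpy as np

    # Assumed reading: from C, go to A with .2, to B with .3, and stay in C
    # with .5 on each step; A and B are absorbing.
    P = np.array([[1.0, 0.0, 0.2],
                  [0.0, 1.0, 0.3],
                  [0.0, 0.0, 0.5]])

    print(np.round(np.linalg.matrix_power(P, 50), 3))
    # approximately:
    # [[1.  0.  0.4]   <- about 40% of C's mass is absorbed into A
    #  [0.  1.  0.6]   <- about 60% is absorbed into B
    #  [0.  0.  0. ]]  <- nothing remains in C in the long run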

  • @shantanubose6736
    @shantanubose6736 12 years ago

    Do you have any video on finding customer lifetime value using Markov chains?

  • @Qhartb
    @Qhartb 12 years ago

    I understand that
    A matrix with an absorbing state may not have a limiting matrix.
    A matrix with an absorbing state with the additional condition must have a limiting matrix.
    I'm pointing out that a matrix with an absorbing state without the additional condition may still have a limiting matrix. Therefore we don't NEED that condition for the limiting matrix to exist; the condition is SUFFICIENT to conclude that the limiting matrix exists. I only have a guess of what is necessary and sufficient.

  • @patrickjmt
    @patrickjmt  12 years ago

    ok, i think you are missing the point. the point is to cover all matrices with absorbing states, which is what the result states.

  • @lucianogatica458
    @lucianogatica458 5 years ago

    EPIC, HARDCORE

  • @patrickjmt
    @patrickjmt  12 years ago

    ok

  • @nimishasingh3983
    @nimishasingh3983 5 years ago

    What is a limiting matrix??

  • @prateekvitian
    @prateekvitian 11 years ago

    Hello Patrick,
    Can you share scanned copies of the notes shown in this video tutorial, or provide a link to any other text tutorial that explains Markov chains in such a simple manner? We need text tutorials to keep handy.
    Thanks & Regards,
    Prateek Dixit

    • @patrickjmt
      @patrickjmt  11 years ago

      i dont have scanned notes, sorry

  • @qazafisheikh530
    @qazafisheikh530 4 years ago

    what is a limiting matrix??

  • @janespinkdiary274
    @janespinkdiary274 12 years ago

    hi, can you upload tutorials for advanced math and numericals as well? cause i really have a lot of trouble with those :)

  • @danielhuisman1996
    @danielhuisman1996 5 years ago

    One could say that drug abuse is our absorbing state, and there are low probabilities that we go from clean to casual use, and from there we can go to heavy abuse.

  • @angelabright8645
    @angelabright8645 2 months ago

    Great

  • @Qhartb
    @Qhartb 12 years ago

    I'm not sure you're understanding me. You say that your result deals with all matrices with absorbing states. That is not the case. The powers of
    1 0 0
    0 .9 .1
    0 .7 .3
    approach
    1 0 0
    0 .875 .125
    0 .875 .125
    This chain has an absorbing state, but it is not possible to go from a non-absorbing state to the absorbing state, and therefore it does not meet your definition of an absorbing Markov chain. The limiting matrix exists nonetheless.
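
    A short Python check of the .875/.125 values (numpy assumed): they are the stationary distribution of the regular 2x2 sub-chain, computed here as its left eigenvector for eigenvalue 1.

    import numpy as np

    # The regular sub-chain on the two non-absorbing states (rows sum to 1).
    Q = np.array([[0.9, 0.1],
                  [0.7, 0.3]])

    # Solve pi Q = pi: take the eigenvector of Q transposed for the largest
    # eigenvalue (which is 1), then normalize it to sum to 1.
    vals, vecs = np.linalg.eig(Q.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    print(np.round(pi, 3))   # [0.875 0.125]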

  • @patrickjmt
    @patrickjmt  12 years ago

    no, i do not like facebook.

  • @DodgeSpikes
    @DodgeSpikes 12 years ago

    do you have like an official facebook page of your own? :P can i have a link please :)?

  • @TheRaphaelriv
    @TheRaphaelriv 12 years ago

    facebook = non-regular Markov chain process!

  • @lucianogatica458
    @lucianogatica458 5 years ago

    awesomeeeeeeeeeeeeeeeeeeeeeeeee

  • @DodgeSpikes
    @DodgeSpikes 11 years ago

    I don't think being a fan of his makes me creepy at all. People like you, why do you have to say such things? Is it too hard to think and speak normally and look at the world with good eyes for once?

  • @EyhSteve
    @EyhSteve 11 years ago

    You're creepy