Your explanations are clear and visual, which I love. Forget my boring lessons, all I need are your videos. Great job, keep it up! - 18-year-old student.
Hi Patrick, I'm an Italian guy studying statistics and finance in my fourth year of university. I want to thank you for all your videos on Markov chains; they're very clear, complete, and simple enough for any level of knowledge. Here in Italy we have the "problem" that topics are addressed in too theoretical a manner (though this approach has its pros).
Greetings,
Nicolò.
Watching this one day before my finals. I haven't understood anything in my subject except for discrete Markov chains, thanks to @patrickJMT xD. Amazing video! Thank you so much!
I got a C and a C- in my year 1 maths courses, but in my second year I watched your videos and eventually got a B! Not an excellent grade, but there was some improvement. Thanks so much! @patrickJMT
Thank you so much, it helped me a lot. The best thing you did was keeping the videos short, which really helps...
I hope that you get paid or something for all the time and effort you put into your videos. If not, I just hope you know that you're helping a lot of people out there, and we really do appreciate your help.
This is a seriously fantastic tutorial.
I can't thank you enough I guess.
I have to say I am grateful.
Thanks a lot, bro... still a lot of help after 8 years too ❤️❤️❤️❤️ 🙏
you are most welcome
But I gave an example of a transition matrix with an absorbing state that does not have a limiting matrix, so clearly something more is needed.
I love your videos. Amazing work Patrick.
hi there! this is really useful - it's clear, and made simple. Easy to comprehend. Thank you! :)
No need to go to class when I have you. Is there a fund we can donate to, so that you can keep on providing good tutorials in different subject areas as well?
If you visit my website and click on any video, a PayPal link pops up that no one ever donates to.
Or you can now visit my page on Patreon; I'd love ya for it.
Did you ever donate the money, or did you just wanna say it?
@@omardinhoR9 why do you care?
What an awesome video. Thank you! Precisely the information I need for my use case.
Great video Patrick, thank you.
I got first rank on my mid-test because of this video. Really.
I'm five years late, but congratulations!
"For transition matrices for Markov chains with one or more absorbing states to have limiting matrices, we NEED one additional condition:"
I think that condition is sufficient but not necessary. Consider a Markov model consisting of two disconnected "chains," one of which is absorbing and one of which is regular. Each chain has a steady state, and therefore so does the system as a whole, but the regular component doesn't have a path to the absorbing state.
ex:
1 0 0
0 .9 .1
0 .7 .3
Thank you very much for all your videos on Markov chains, you made it very understandable :)
thank you, you are saving lives :')
I've really appreciated your Markov Chain videos -- thanks. I'll just offer that sometimes, like in this discussion of absorbing states, I think the explanation could be made much clearer by showing the state-transition diagram for the matrix.
Very clearly explained. :]
Great video! Just one suggestion: at the end of each video you could add a quick summary of the most important points, so someone can review them in order in a short time and learn them better before moving on to the next video. Keep up the good work!
Hey, do you have any videos on first-step analysis / the expected number of visits for Markov chains?
I'm working on a google coding challenge right now, and I have to implement an algorithm that uses ideas from this concept. It's a great help, to say the least.
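In case it helps anyone on a similar challenge: here is a minimal sketch of the standard fundamental-matrix computation for an absorbing chain, assuming Python with numpy. The matrix below is a made-up toy example, not the challenge's actual data.

import numpy as np

# Toy absorbing chain in canonical form: states 0 and 1 are transient, state 2 is absorbing.
# Q holds transient-to-transient probabilities, R holds transient-to-absorbing probabilities.
Q = np.array([[0.5, 0.3],
              [0.2, 0.4]])
R = np.array([[0.2],
              [0.4]])

# Fundamental matrix N = (I - Q)^-1, then B = N R gives the absorption probabilities.
N = np.linalg.inv(np.eye(2) - Q)
B = N @ R
print(B)  # each entry is ~1.0 here, since there is only one absorbing state to end up in

Row i of B is the probability of eventually landing in each absorbing state when starting from transient state i.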
It is even more descriptive than the mainland :D
Hi Patrick. Thank you very much for your videos. I have some questions regarding the expected number of steps to go from state i to state j, and about expected time in continuous-time Markov chains, e.g. the expected time to go from state i to an absorbing state j. Thank you very much.
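Not Patrick, but for the discrete-time part of the question, the usual recipe is the fundamental matrix again: the expected number of steps to absorption from each transient state is t = (I - Q)^-1 * 1. A small sketch with made-up numbers, assuming numpy:

import numpy as np

# Made-up chain: states 0 and 1 are transient, state 2 is absorbing (rows are "from" states).
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.0, 0.0, 1.0]])

Q = P[:2, :2]                      # transient-to-transient block
N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
t = N @ np.ones(2)                 # expected steps to absorption from states 0 and 1
print(t)                           # roughly [5.71, 4.29]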
Great videos. I know they are old, but if you haven't gotten one yet, a mic for louder and better audio quality would be great.
Could you maybe explain the second criterion in the Absorbing Markov Chains a bit further? In particular, it's odd that it includes the restriction about "finite number of steps". Is this some qualification that is here because of some as yet undiscussed application of Markov Chains? I just can't see what it would mean for the transition to be possible only if there WERE an infinite number of steps (otherwise why exclude it?). Thanks!
thanks man!
Hi Patrick,
I have a question. As far as I could make out from the videos in this MC series, all absorbing Markov chains are regular, but the converse isn't true. Am I right? And can you please tell me clearly what the differences are between absorbing and regular MCs?
What if there is a two-way street? Would it still be possible for non-absorbing states to go to an absorbing state? For instance, we have .1 going to state B from state C and 0 going to state C from state B. Suppose instead we have some number, let's say .4, going to state C from state B. Now this is a two-way street. Would it still be possible for these non-absorbing states to go to state A, which is an absorbing state?
My other question: suppose we have two absorbing states, A and B, and a non-absorbing state C, with people going from state A to state C and from B to C, but no one going from state C to either of those absorbing states. Does that mean it's not possible for people/anything to go from state C to those absorbing states? Also, some people stay in state C. Ex.:
   A  B  C
   1  0  .2
   0  1  .3
   0  0  .5
Do you have any video on finding customer lifetime value using Markov chains?
I understand that
A matrix with an absorbing state may not have a limiting matrix.
A matrix with an absorbing state with the additional condition must have a limiting matrix.
I'm pointing out that a matrix with an absorbing state without the additional condition may still have a limiting matrix. Therefore we don't NEED that condition for the limiting matrix to exist; the condition is SUFFICIENT to conclude that the limiting matrix exists. I only have a guess of what is necessary and sufficient.
OK, I think you are missing the point. The point is to cover all matrices with absorbing states, which is what the result states.
EPIC, THIS GOES HARD
ok
What is a limiting matrix?
Hello Patrick,
Can you share scanned copies of the notes shown in this video tutorial, or provide a link to any other text tutorial that explains Markov chains in such a simple manner? We need text tutorials to keep handy.
Thanks & Regards,
Prateek Dixit
I don't have scanned notes, sorry.
What is a limiting matrix?
Hi, can you upload tutorials for advanced math and numerical problems as well? I really have a lot of trouble with those :)
One could say that drug abuse is our absorbing state: there's a low probability we go from clean to casual use, and then we can go on to heavy abuse.
Great
I'm not sure you're understanding me. You say that your result deals with all matrices with absorbing states. That is not the case. The powers of
1 0 0
0 .9 .1
0 .7 .3
approach
1 0 0
0 .875 .125
0 .875 .125
This chain has an absorbing state, but it is not possible to go from a non-absorbing state to the absorbing state, and therefore it does not meet your definition of an absorbing Markov chain. The limiting matrix exists nonetheless.
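For anyone who wants to check this numerically, here is a quick sketch, assuming Python with numpy, that raises the example matrix to a large power:

import numpy as np

# The example transition matrix from the comment above (rows are "from" states)
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.9, 0.1],
              [0.0, 0.7, 0.3]])

# If the powers settle down, the limiting matrix exists even though the
# absorbing state is unreachable from the other two states.
print(np.round(np.linalg.matrix_power(P, 100), 3))
# [[1.    0.    0.   ]
#  [0.    0.875 0.125]
#  [0.    0.875 0.125]]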
No, I do not like Facebook.
Do you have, like, an official Facebook page of your own? :P Can I have a link please? :)
Facebook = non-regular Markov chain process!
awesooooome
I don't think being a fan of him makes me creepy. People like you, why do you have to say such things? Is it too hard to think and speak normally and look at the world with good eyes for once?
You're creepy