Intro to Markov Chains & Transition Diagrams

  • Published: 14 May 2024
  • Markov Chains, or Markov Processes, are an extremely powerful tool from probability and statistics. They represent a statistical process that happens over and over again, where we try to predict the future state of a system. A Markov process is one where the probability of the future ONLY depends on the present state, and ignores the past entirely. This might seem like a big restriction, but what we gain is a lot of power in our computations. We will see how to come up with a transition diagram to describe the probabilities of shifting between different states, and then do an example where we use a tree diagram to compute the probabilities two stages into the future. We finish with an example looking at bull and bear weeks in the stock market.
    Coming Soon: The follow-up video covers using a Transition Matrix to easily compute probabilities multiple steps into the future.
    0:00 Markov Example
    2:04 Definition
    3:02 Non-Markov Example
    4:06 Transition Diagram
    5:27 Stock Market Example
    COURSE PLAYLISTS:
    ►CALCULUS I: • Calculus I (Limits, De...
    ► CALCULUS II: • Calculus II (Integrati...
    ►MULTIVARIABLE CALCULUS (Calc III): • Calculus III: Multivar...
    ►DIFFERENTIAL EQUATIONS (Calc IV): • How to solve ODEs with...
    ►DISCRETE MATH: • Discrete Math (Full Co...
    ►LINEAR ALGEBRA: • Linear Algebra (Full C...
    OTHER PLAYLISTS:
    ► Learning Math Series
    • 5 Tips To Make Math Pr...
    ►Cool Math Series:
    • Cool Math Series
    BECOME A MEMBER:
    ►Join: / @drtrefor
    Special thanks to Imaginary Fan members Frank Dearr & Cameron Lowes for supporting this video.
    MATH BOOKS & MERCH I LOVE:
    ► My Amazon Affiliate Shop: www.amazon.com/shop/treforbazett
    SOCIALS:
    ►Twitter (math based): / treforbazett
    ►Instagram (photography based): / treforphotography
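The two-stage tree computation described in the summary can be sketched in a few lines of code: multiply probabilities along each branch of the tree, then add the branches. A minimal sketch with made-up transition probabilities (the video's actual numbers may differ):

```python
# Two-state Markov chain for bull/bear weeks. The transition
# probabilities below are made up for illustration; the video's
# actual numbers may differ.
P = {
    "bull": {"bull": 0.9, "bear": 0.1},
    "bear": {"bull": 0.2, "bear": 0.8},
}

def two_step_prob(start, end):
    """Probability of being in `end` two steps after `start`:
    multiply along each branch of the tree, then add the branches."""
    return sum(P[start][mid] * P[mid][end] for mid in P)

# E.g. P(bull two weeks from now | bull now):
print(two_step_prob("bull", "bull"))  # 0.9*0.9 + 0.1*0.2 = 0.83
```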

Comments • 105

  • @SAAARC
    @SAAARC 3 years ago +21

    A subscription to your channel is the gift that keeps on giving.

    • @DrTrefor
      @DrTrefor 3 years ago +8

      Glad you're enjoying!

  • @Darkev77
    @Darkev77 3 years ago +46

    The most thorough and clear explanation ever. Can't wait for the next video!

    • @DrTrefor
      @DrTrefor 3 years ago +5

      Thank you! Follow up video coming next week:)

  • @mapmap123456
    @mapmap123456 11 months ago

    One of the best explanations of Markov chains on YouTube. Thank you.

  • @MrChinook1991
    @MrChinook1991 1 year ago +7

    Very well explained, mate. Videos like these are a great intro to a statistical topic and a great foundation to dive deeper into the math behind it.

  • @imdadood5705
    @imdadood5705 2 years ago +2

    Very clear explanation with easy examples. Thank you!

  • @PrpleHatMan
    @PrpleHatMan 1 year ago +3

    A very well presented and insightful lesson.
    Thank you for the 2 part explanation!

  • @freedmoresidume
    @freedmoresidume 2 years ago +1

    Crystal clear explanation, thank you Dr.

  • @bijanghofranian5782
    @bijanghofranian5782 2 years ago +1

    I was searching for "Wiener-Lévy process which is also a Markov process" but luckily I ended up here, a wonderful serendipity! Thanks for the simple and concise explanation Dr. Trefor.

  • @catcen9631
    @catcen9631 2 years ago +2

    incredible video with a super clear explanation!

  • @joevanderstelt8143
    @joevanderstelt8143 2 years ago +1

    Subscribed to this channel after about 30 seconds.... amazing

  • @aqeelzeid24
    @aqeelzeid24 2 years ago +1

    I watched around 5 videos; this explained it the best!! Thanks a lot.

  • @paulkaranja9264
    @paulkaranja9264 4 months ago +5

    You just made my life better all the way in a small country found in Africa called Kenya. Thank you.

  • @byronwilliams7977
    @byronwilliams7977 1 year ago +2

    Love your videos man, keep up the great work. You and Presh Talwalker are the best.

  • @abdullahbinnaeem9502
    @abdullahbinnaeem9502 4 months ago

    Never seen such an explanation before. Amazing, Sir.

  • @sjchsbc
    @sjchsbc 3 years ago +3

    Thank you for making this.

  • @maulikjadav9673
    @maulikjadav9673 2 years ago +1

    Awesome explanation, Thank you sir. 👍

  • @HA-zd5gx
    @HA-zd5gx 3 years ago +1

    I'm happy that I found this channel. Thank you!

  • @kaniki_the_problem
    @kaniki_the_problem 2 years ago +1

    You're a natural. Thanks, preparing for the Risk Modeling and Survival Analysis actuarial exam.

  • @Big_Mo_Zak
    @Big_Mo_Zak 6 months ago

    Very well explained. Thank you.

  • @priyankashaw3010
    @priyankashaw3010 3 years ago +1

    Really helped me understand this concept on the first go.

  • @eleazertham5033
    @eleazertham5033 3 years ago +1

    Thanks I needed this for my upcoming exams :-)

  • @jesseluinstra1192
    @jesseluinstra1192 1 year ago +1

    Thank you. This was a very good explanation

  • @sanklink
    @sanklink 2 years ago +1

    Awesome explanation!

  • @eyakhamassi8748
    @eyakhamassi8748 1 year ago

    Thank you so much, this is the best explanation ever ❤❤❤

  • @genuinebombayite1966
    @genuinebombayite1966 3 years ago +4

    This was such a wonderfully clear explanation. Thank you so much!

    • @DrTrefor
      @DrTrefor 3 years ago +2

      Glad it was helpful!

    • @genuinebombayite1966
      @genuinebombayite1966 3 years ago

      @@DrTrefor Not to get too greedy, but can you do one on Hidden Markov models? Thanks a bunch again!

  • @dimitrychi
    @dimitrychi 3 years ago +1

    Perfect. Dr. Trefor you are the best!

    • @DrTrefor
      @DrTrefor 3 years ago

      Thank you, glad you enjoyed!

  • @camsasuncion
    @camsasuncion 3 years ago +1

    I really appreciate the clarity of the explanation. I am now a subscriber and a fan! Thank you.

    • @DrTrefor
      @DrTrefor 3 years ago

      Thanks for the sub!

  • @satyambhardwaj2289
    @satyambhardwaj2289 3 years ago +3

    Absolute delight... exactly when I wanted it for an AI implementation.

  • @randa_alwadi
    @randa_alwadi 1 year ago +1

    Your explanation is super!

  • @drachenschlachter6946
    @drachenschlachter6946 7 months ago

    A very good explanation!

  • @anishjoshi1999
    @anishjoshi1999 2 years ago +1

    Thank you so much Doctor 😍

  • @TPLCreationLoft
    @TPLCreationLoft 1 year ago

    Best explanation of Markov Chains I've seen. Most videos don't explain how you get the initial probabilities, but from your explanation I understood that they're equally distributed at the outset (that is, if I understood correctly) and can stabilize as frequency outcomes over iterations. Thank you.
    However, one thing I wasn't too clear on: if a Markov chain only depends on the current state for predicting future states, wouldn't a tree that predicts near- or distant-future states not be using the Markov property, since there's a whole chain of dependencies?

  • @Shaunmcdonogh-shaunsurfing
    @Shaunmcdonogh-shaunsurfing 1 year ago

    Fantastic explanation

  • @MrBitviper
    @MrBitviper 10 months ago +2

    amazing explanation. you have a knack for making these difficult topics understandable
    thank you so much for this

    • @DrTrefor
      @DrTrefor 10 months ago +2

      Glad it was helpful!

  • @TheMostafa5000
    @TheMostafa5000 4 months ago

    amazing explanation.

  • @rashasulieman2525
    @rashasulieman2525 2 years ago

    Wow! Just beautiful! Love the way you explain things!

    • @DrTrefor
      @DrTrefor 2 years ago +1

      Thank you so much!

  • @jenishghimire6678
    @jenishghimire6678 8 months ago

    Such a good way of explaining.

  • @thedigitalphysicist
    @thedigitalphysicist 2 years ago +4

    A simple explanation of something that could easily be made very complex. Thank you, Dr. Bazett.

  • @peachiichaii6375
    @peachiichaii6375 2 years ago

    Oh my, this was so helpful, thank you so much.

  • @ipshitaghosh2656
    @ipshitaghosh2656 3 years ago +7

    Rare moments.. when you understand in the first go.

  • @Storyguy
    @Storyguy 2 years ago

    Great Sir.... the explanation is ridiculous ❤❤

  • @imad_uddin
    @imad_uddin 2 years ago +2

    This was strikingly clear and fresh. Loved it!

  • @continnum6540
    @continnum6540 1 year ago +1

    Wow 🔥🔥🔥

  • @mcyz7871
    @mcyz7871 3 years ago +1

    awesome job

  • @HylianEvil
    @HylianEvil 3 years ago +7

    If only the next video was out so I could make a Markov chain to predict when the next video will be out

    • @DrTrefor
      @DrTrefor 3 years ago +3

      Haha, well, past behavior indicates I release a lot of videos on Mondays, but you can't look at that unless you model with a non-Markov process ;)

  • @Salvador-xy5es
    @Salvador-xy5es 1 year ago

    Man, you are awesome, THANK YOU!!!

  • @giovanniberardi4134
    @giovanniberardi4134 2 years ago +2

    I really appreciated your video. I have a question: in the market example, with a drift rate of zero, would the transition probabilities have been even? I mean, 50% probability of transitioning from bull to bear and vice versa? Thank you very much.

  • @thetutorialdoctor
    @thetutorialdoctor 6 months ago

    Excellent

  • @OfferoC
    @OfferoC 3 years ago +1

    Awesome thanks

  • @othmanaljbory3649
    @othmanaljbory3649 2 years ago

    Well done, Professor. Thank you very much; best wishes for success.

  • @joicet5230
    @joicet5230 2 years ago

    nice explanation for a beginner like me

  • @sebaaismail1951
    @sebaaismail1951 3 years ago +1

    Thank you Dr, your videos are useful.

  • @muhammadneanaa1611
    @muhammadneanaa1611 3 years ago +1

    Thanks!

  • @Tiredprincessss
    @Tiredprincessss 3 years ago

    THANK YOU

  • @ungoyboy2006
    @ungoyboy2006 3 years ago

    Thanks for the video. It was mentioned that a Markov chain is only based on the "present" state; however, the transition probabilities themselves are based on historical data, right? Just trying to get my head around that distinction.

  • @thepresistence5935
    @thepresistence5935 2 years ago +4

    Please don't go to MIT or Stanford at all; stay here. We need you.

  • @ccuuttww
    @ccuuttww 3 years ago +1

    You might also talk about how to find the stable state with linear algebra (PDP^-1), which relates to your series.

    • @DrTrefor
      @DrTrefor 3 years ago

      Indeed, the next video is going to cover the connection to linear algebra although I won’t get to diagonalization for a while yet
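The stable state mentioned in this exchange can also be found without diagonalization, just by applying the transition matrix repeatedly until the distribution stops changing. A minimal sketch with made-up bull/bear probabilities (not the video's numbers); the diagonalization view gives the same answer, since the stationary vector is the eigenvector for eigenvalue 1:

```python
# Finding the stable (stationary) distribution by repeatedly applying
# the transition matrix. Probabilities are assumptions for illustration.
T = [[0.9, 0.1],   # from bull: P(to bull), P(to bear)
     [0.2, 0.8]]   # from bear: P(to bull), P(to bear)

def step(dist):
    # new_j = sum_i dist_i * T[i][j]  (row vector times matrix)
    return [sum(dist[i] * T[i][j] for i in range(2)) for j in range(2)]

dist = [1.0, 0.0]        # start fully in the bull state
for _ in range(100):     # iterate; converges since |second eigenvalue| < 1
    dist = step(dist)

print(dist)  # approaches [2/3, 1/3] for these numbers
```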

  • @mattpecevich3749
    @mattpecevich3749 3 years ago +5

    What a great video, thank you!!
    Question: in your example of the stock market I think you used historical data to generate those bull-bull and bear-bear probabilities. Is it still a Markov process if the probabilities on the tree are derived from the past?

    • @DrTrefor
      @DrTrefor 3 years ago +9

      Ah, great point, and it takes a bit of further consideration of what we REALLY mean by "the past". It is more about ignoring the recent past. So I'm not looking at last week or a month ago in my predictions for next week. But you are right that while I made up the numbers in this example, they would have come from looking at some historical average over, say, decades.
      A similar example might be weather. A Markov model might be constructed that says: given the weather today, what is the probability of the weather tomorrow? It would be Markov because it ignores what the weather was like yesterday or a week ago. However, the model might still build in historical climate data about what the weather is like generally.
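The estimation step described in this reply can be sketched in code: count the observed transitions in a historical sequence and normalize. The weekly bull/bear sequence below is invented purely for illustration:

```python
from collections import Counter

# Estimating transition probabilities from historical data -- a sketch.
# The weekly sequence here is made up, not real market data.
history = ["bull", "bull", "bear", "bull", "bull", "bull", "bear", "bear"]

counts = Counter(zip(history, history[1:]))  # (from, to) transition counts
totals = Counter(history[:-1])               # times each state was left

P = {frm: {to: counts[(frm, to)] / totals[frm] for to in ("bull", "bear")}
     for frm in ("bull", "bear")}

print(P["bull"])  # {'bull': 0.6, 'bear': 0.4}
```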

    • @mattpecevich3749
      @mattpecevich3749 3 years ago +1

      Dr. Trefor Bazett thanks, that makes sense! Looking forward to the next one!

  • @georgesalexandrebajk8223
    @georgesalexandrebajk8223 3 years ago

    Thank you Dr, very well explained 👌
    Do you have any videos about the Poisson process and the exponential distribution?

    • @DrTrefor
      @DrTrefor 3 years ago

      Thank you! Yes I do plan to do a stats series at some point, but not just yet sorry:)

  • @sukanthenss914
    @sukanthenss914 3 years ago +1

    Hey @Dr. Trefor Bazett, is this the basics for Reinforcement Learning?!

  • @admiralhyperspace0015
    @admiralhyperspace0015 3 years ago +1

    Can't appreciate your videos enough. Please keep making them, and I am second. hehehe

    • @DrTrefor
      @DrTrefor 3 years ago

      Will do! 2nd is still pretty good, haha:D

  • @bonecircuit9123
    @bonecircuit9123 10 months ago

    What would the differences be compared to a finite state machine and its states?

  • @sambhavgupta4653
    @sambhavgupta4653 11 months ago

    So basically we can consider Markov chains as studying dependent probabilities?
    Also, how is the bear and bull example Markovian? As you mentioned, using an old data set (past info) is a non-Markovian process.

  • @jace3789
    @jace3789 2 years ago +1

    Did you do the Transition Matrix video from the 'coming soon' note? Thanks

    • @DrTrefor
      @DrTrefor 2 years ago +1

      Yup, should be in the discrete math playlist

  • @Sam-fv4xq
    @Sam-fv4xq 3 years ago +1

    better than my teacher

  • @ayamohammed4355
    @ayamohammed4355 2 years ago

    A clear explanation. Please, how can I contact you? I have some questions. Is there an email?

  • @twishanuaichroy1938
    @twishanuaichroy1938 5 months ago

    Best

  • @kdewanik
    @kdewanik 3 years ago +3

    You are so amazing, Dr. Trefor, making all this heavy content accessible to everyone. Best wishes for the channel to grow exponentially. 🎊😇 I am doing my best to promote this content to everyone!!

    • @DrTrefor
      @DrTrefor 3 years ago

      Thanks for your help Dewanik, really appreciate it!

  • @SHAHHUSSAIN
    @SHAHHUSSAIN 3 years ago +1

    Crown of mathematics🥰🥰💝💝♥️♥️

    • @DrTrefor
      @DrTrefor 3 years ago

      haha, thank you Shah!

  • @tuongnguyen9391
    @tuongnguyen9391 3 years ago +1

    So what kind of playlist would this Markov Chain video belong to?

    • @DrTrefor
      @DrTrefor 3 years ago

      Right now it’s in my discrete math playlist, but anything with probability or stats could talk about this.

  • @jacobfertleman1980
    @jacobfertleman1980 1 year ago

    I desperately need to see a movie with the character he described: a superhero with an hour-long memory span 😂

  • @CamEron-nj5qy
    @CamEron-nj5qy 1 year ago +2

    A superhero who can't fly but has a cape 😂

  • @binshebah
    @binshebah 3 years ago

    In the final step, why did you multiply the probabilities of each branch and not add them?

    • @mryup6100
      @mryup6100 2 years ago

      I would like to know as well. I don't know the logic behind it.
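One way to see the question above: each branch of the tree is a sequence of successive transitions, so its probability is the product of the conditional probabilities along it (an AND); different branches are mutually exclusive alternatives, so their probabilities are added (an OR). A small sketch with assumed probabilities, checking that the probabilities of all two-step branches sum to 1:

```python
from itertools import product

# Multiply along a branch (successive transitions), add across branches
# (mutually exclusive alternatives). Probabilities are assumptions.
P = {"bull": {"bull": 0.9, "bear": 0.1},
     "bear": {"bull": 0.2, "bear": 0.8}}

def path_prob(start, path):
    prob, state = 1.0, start
    for nxt in path:            # multiply conditional probs along the branch
        prob *= P[state][nxt]
        state = nxt
    return prob

# The four two-step branches from "bull" are exclusive and exhaustive,
# so adding their (multiplied) branch probabilities gives exactly 1.
total = sum(path_prob("bull", p) for p in product(P, repeat=2))
print(total)  # 1.0
```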

  • @physicslover1950
    @physicslover1950 3 years ago +1

    Please make videos on vector calculus, i.e. the calculus of vector fields. Please, sir. There are no sources on YouTube about this topic. Sir, we don't know about any Android software that can help us plot vector fields.

    • @DrTrefor
      @DrTrefor 3 years ago +1

      It's coming, starting in about 2 weeks!

    • @physicslover1950
      @physicslover1950 3 years ago

      @@DrTrefor Would you recommend software on Android that can plot vector fields?

    • @DrTrefor
      @DrTrefor 3 years ago

      I would try navigating to WolframAlpha in a browser first, I think, but I'm sure there are others.

  • @BlackCodeMath
    @BlackCodeMath 3 months ago

    Which came first, the Markov Chain or the DFA/NFA?

  • @DrewCocker
    @DrewCocker 5 months ago

    Why are the superheroes randomly moving around the city?

  • @aashsyed1277
    @aashsyed1277 2 years ago

    mark OV chain

  • @ronaldmarcks1842
    @ronaldmarcks1842 6 months ago

    This is counter-intuitive, the notion that experience has no value. Thanks.

  • @metalhamster14
    @metalhamster14 1 year ago

    shame about the audio

  • @twilight1176
    @twilight1176 1 year ago

    I don't think that's a cool superhero.