Long Short-Term Memory (LSTM), Clearly Explained

  • Published: 26 Jan 2025

Comments • 1.3K

  • @statquest
    @statquest  2 years ago +46

    To learn more about Lightning: lightning.ai/
    NOTE: Since LSTM is a type of neural network, we find the best Weights and Biases using backpropagation, just like for any other neural network. For more details on how backpropagation works, see: ruclips.net/video/IN2XmBhILt4/видео.html ruclips.net/video/iyn2zdALii8/видео.html and ruclips.net/video/GKZoOHXGcLo/видео.html The only difference with LSTMs is that you have to unroll them for all of your data first and then calculate the derivatives. In the example in this video, that means unrolling the LSTM 4 times (as seen at 17:49), calculating the derivatives for each variable, starting at the output, for each copy, and then adding them together.
    ALSO NOTE: A lot of people ask why the predictions the LSTM makes for days other than day 5 are bad. The reason is that in order to illustrate how, exactly, an LSTM works, I had to use a simple example, and this simple example only works if it is trained to predict day 5 and only day 5.
    Support StatQuest by buying my books The StatQuest Illustrated Guide to Machine Learning, The StatQuest Illustrated Guide to Neural Networks and AI, or a Study Guide or Merch!!! statquest.org/statquest-store/
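
The unrolling-and-summing idea in the note above can be sketched numerically. The toy cell below is a bare-bones recurrence (not a full LSTM), with one shared weight and made-up numbers; the point is only that a single derivative with respect to the shared weight already contains the summed contributions from all four unrolled copies.

```python
import math

def forward(w, xs, h0=0.0):
    """Unroll a toy recurrent cell: h = tanh(w * (x + h)).
    The same weight w is reused by every one of the unrolled copies."""
    h = h0
    for x in xs:
        h = math.tanh(w * (x + h))
    return h  # final short-term memory = the prediction

def loss(w, xs, target):
    return (forward(w, xs) - target) ** 2

xs = [0.0, 0.5, 0.25, 1.0]  # four days of input -> the cell is unrolled 4 times
target, w, eps = 1.0, 0.7, 1e-6

# Central-difference estimate of d(loss)/dw. Because w appears in all four
# unrolled copies, this single number is the sum of each copy's contribution,
# exactly as described in the note.
grad = (loss(w + eps, xs, target) - loss(w - eps, xs, target)) / (2 * eps)
print(f"d(loss)/dw ≈ {grad:.4f}")
```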

    • @enggm.alimirzashortclipswh6010
      @enggm.alimirzashortclipswh6010 1 year ago +3

      you deserve a shameless promotion for this lecture dude.

    • @statquest
      @statquest  1 year ago +1

      @@enggm.alimirzashortclipswh6010 Thank you! :)

    • @NicitoStaAna
      @NicitoStaAna 3 months ago

      Yow, a new paper dropped
      "Were RNNs All We Needed?"
      I feel like this needs an update

    • @arunkennedy9267
      @arunkennedy9267 2 months ago

      Please make a video on backpropagation through time, for RNNs/LSTMs.

    • @statquest
      @statquest  2 months ago

      @@arunkennedy9267 It's in my new book that will come out in early January.

  • @ranjitgopi7519
    @ranjitgopi7519 1 year ago +41

    Thanks StatQuest for everything!

    • @statquest
      @statquest  1 year ago +13

      TRIPLE BAM!!! Thank you so much for supporting StatQuest!!!

  • @drranjitha
    @drranjitha 1 year ago +344

    "At first I was scared of how complicated the LSTM was, but now I understand."
    "TRIPLE BAM!!!"
    Thanks Dr. Starmer for teaching in a way I could follow. I am placing an order for your book today.

    • @statquest
      @statquest  1 year ago +19

      Hooray!!! Thank you very much! :)

  • @kylek29
    @kylek29 2 years ago +330

    As someone who has watched a ton of videos on these topics, I can say that you probably do the best job of explaining the underlying functionality in a simple to follow way. So many other educators put up the standard flowchart for a model and then talk about it. Having the visual examples of data going in and changing throughout really helps hammer the concept home.

    • @statquest
      @statquest  2 years ago +12

      Thank you very much!

    • @suzhenkang
      @suzhenkang 2 years ago +7

      great great great great great great great great great great great great great great video!

    • @statquest
      @statquest  2 years ago +7

      @@suzhenkang Thank you very much! :)

    • @johnyeap7133
      @johnyeap7133 2 years ago +1

      yes definitely

    • @chenchen4619
      @chenchen4619 1 year ago +5

      @@statquest also cool sound effects

  • @robertpolevoi8630
    @robertpolevoi8630 1 year ago +148

    Some rare teachers have instant cred. The moment they start talking you are convinced they really understand the subject and are qualified to teach it. As an experienced teacher of extremely challenging tech myself, I confess that I've never seen more complete and polished preparation. You are changing people's lives at just the moment when this is so critical. Best of everything to you.

    • @statquest
      @statquest  1 year ago +11

      Thank you very much! I really appreciate it.

  • @muhammadzakiahmad8069
    @muhammadzakiahmad8069 1 year ago +138

    2 years into data science, many paid and unpaid courses, and I never understood the underlying functionality of LSTM, but today I finally do. Thank you, Mr. Josh Starmer, for being in my life.

  • @SY-jh3tg
    @SY-jh3tg 1 year ago +41

    I have been working in the ML industry for 5 years now, but I never had this clear an understanding. Not only have you explained this clearly, you have also sparked a curiosity to understand everything with this much clarity. Thanks Josh!!

  • @nicolasfonteyne7367
    @nicolasfonteyne7367 5 months ago +3

    Your teaching techniques are just magical! Keep up this amazing job! BAM!!!

    • @statquest
      @statquest  5 months ago

      TRIPLE BAM! Thank you so much for supporting StatQuest!

  • @khoile1269
    @khoile1269 1 year ago +17

    As a beginner, I find your videos easy to follow and understand. I love the way you use visual examples with different colors, which makes it easier to follow. And the curiosity to learn more is what makes your videos really impressive to me. Thank you, Josh!

  • @PuroCyanHQ
    @PuroCyanHQ 2 years ago +31

    I have been waiting for your LSTM video for so long! No other videos can explain ML concepts as well as you do. You, sir, deserve a thousand BAMs!!

    • @statquest
      @statquest  2 years ago +2

      Thank you very much! BAM! :)

  • @gabrielplzdks3891
    @gabrielplzdks3891 2 years ago +66

    The ease with which you explain these topics has inspired me to pursue a masters in data science. Thank you for helping me unveil my passion.

    • @rathnakumarv3956
      @rathnakumarv3956 2 years ago +1

      One can learn anything out of passion, but one should invest money and time only in employable courses.

    • @statquest
      @statquest  2 years ago +14

      BAM! :)

  • @hemesh5663
    @hemesh5663 1 year ago +30

    Mad respect for putting in the hours to prepare the material for the course. These topics are some of the more complicated ones, and yet your illustrations + explanations + awesome songs make them easy and enjoyable.

    • @statquest
      @statquest  1 year ago +2

      Thank you very much! :)

  • @dikshantgupta5539
    @dikshantgupta5539 1 year ago +6

    I had never fully understood the workings of LSTM, and tried many blogs and videos on it, until I watched your video. This is by far the best explanation of LSTM I have seen on the internet. Thank you so much for putting so much hard work into creating these videos.

  • @crystalcleargirl07
    @crystalcleargirl07 8 months ago +9

    the first stage of LSTM unit determines what percentage of long term memory is remembered ... you are absolutely amazing!
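
That first stage is the forget gate: a sigmoid turns the current input and the short-term memory into a number between 0 and 1, and the long-term memory is multiplied by that percentage. A minimal sketch, with all weights and biases invented for illustration (they are not the trained values from the video):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Stage 1 of an LSTM unit: the forget gate.
# All weights/biases below are made up for illustration.
w_input, w_short, bias = 1.6, 2.7, 1.6   # hypothetical learned parameters
long_term_memory = 2.0
short_term_memory = 1.0
x = 1.0                                  # current input value

# The sigmoid output is the "percentage of long-term memory remembered".
percent_remembered = sigmoid(w_short * short_term_memory + w_input * x + bias)
new_long_term = percent_remembered * long_term_memory

print(f"% remembered = {percent_remembered:.3f}")  # always between 0 and 1
```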

  • @amirrezalotfy
    @amirrezalotfy 1 year ago +5

    I read and watched various articles and videos about LSTM, but none of them explained as well and simply as you. I learned a lot from your video and I am indebted to you. Thank you for taking the time to make this video. I hope you are always healthy and happy. BAM :)

  • @deathstalkr_
    @deathstalkr_ 7 months ago +3

    Watched a lot of videos on LSTM, but BAM…this was the one. I really liked the part where you put in real numbers and showed the calculations taking place. This really helped me get perspective on the inner workings. Kudos for putting in the hard work to make this amazing video.

    • @statquest
      @statquest  7 months ago +1

      Thank you! :)

  • @Cld136
    @Cld136 1 year ago +2

    Thank you! This is so intuitive.

    • @statquest
      @statquest  1 year ago +1

      TRIPLE BAM!!! Thank you so much for supporting StatQuest! :)

  • @be_cracked8212
    @be_cracked8212 1 year ago +16

    Damn, these are very well explained. The somewhat silly humor isn't quite for me, but with these high quality explanations I couldn't care less about that. Great job!

  • @willw4096
    @willw4096 1 year ago +2

    Thanks!

    • @statquest
      @statquest  1 year ago +1

      TRIPLE BAM!!! Thank you so much for supporting StatQuest!!! :)

  • @yyyzzz-k3r
    @yyyzzz-k3r 1 year ago +4

    This is the best tutorial so far. Thank you for your clear explanation! I watched every episode of your NN series. I am a CS student building a voice cloning app for my honours project. Your tutorials are truly helpful!!!

    • @statquest
      @statquest  1 year ago

      Thanks! I'm really glad my videos are helpful! :)

  • @amitpraseed
    @amitpraseed 11 months ago +2

    Are these topics covered in "The StatQuest Illustrated Guide to Machine Learning"? The video is hands down the BEST explanation of LSTM I have seen anywhere!!!

    • @statquest
      @statquest  11 months ago +1

      The chapter on neural networks does not cover LSTMs. Just the basics + backpropagation.

    • @amitpraseed
      @amitpraseed 11 months ago

      @@statquest Would you consider a book explaining deep learning concepts? It would be a major life saver for all of us

    • @statquest
      @statquest  11 months ago +1

      @@amitpraseed I'm, slowly, working on one.

  • @saeethegreat1672
    @saeethegreat1672 1 year ago +5

    BEST TEACHER EVER! seriously how do you make such complicated matters so simple and easy to understand? You're amazing and even tho I didn't plan on learning Machine learning, I'm soooo gonna watch every last video on this channel! thank you for this!

  • @billy.n2813
    @billy.n2813 1 year ago +2

    Thank you very much sir. I was hired as an undergraduate research assistant earlier this year, and took this opportunity to discover and learn about Deep Learning. I am currently learning about RNNs, and this video was of great value to me.
    Thank you very much for this.

  • @mahu1203
    @mahu1203 1 year ago +4

    Thank you, you are the best teacher I have seen. 🎉🎉🎉 Hurray. I learn more from you than from my actual teachers, who just waste my time and wear my nerves down ....
    Thank you 🙏🙏🙏🙏🙏🙏

    • @statquest
      @statquest  1 year ago

      I'm glad my videos are helpful! :)

  • @TodoProcesos
    @TodoProcesos 6 months ago +1

    A huge thanks for all the effort that you put into this series of videos. You are changing the course of people's lives.

  • @sayakbhattacharya9188
    @sayakbhattacharya9188 2 years ago +10

    Man the timing! I just saw your RNN video yesterday and was waiting for your LSTM video. Your timing is just impeccable

  • @krapukhin
    @krapukhin 2 years ago +3

    Josh, your videos and book have been an incredible discovery for me. The visual explanation is much easier to understand. Thank you!

    • @statquest
      @statquest  2 years ago +1

      Thank you very much! :)

  • @gloriasegurini3957
    @gloriasegurini3957 11 months ago +1

    I'm currently studying for my NLP exam. Sending all my gratitude from Italy for such a clear and in-depth explanation.

    • @statquest
      @statquest  11 months ago

      Good luck! BAM! :)

  • @dallashutchinson3783
    @dallashutchinson3783 2 years ago +7

    Such a wonderful explanation. I have been learning about LSTMs in my course but finally understand how they work now. Looking forward to the next step of the Quest!

  • @xiaoHuanyu1027
    @xiaoHuanyu1027 11 months ago +1

    I think this is the clearest explanation of LSTM I've ever seen. I've watched many other videos that teach LSTM, but none of them made it feel so clear!

  • @aayushjariwala6256
    @aayushjariwala6256 2 years ago +20

    Requested a video on NLP some time ago, and here is StatQuest with a better explanation than I expected! (Other YouTubers and courses taught me 'how does an LSTM work?', but your explanation taught me 'why does an LSTM work?'. The clarification of the difference between sigmoid and tanh solved many of my questions.)
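
The sigmoid/tanh distinction mentioned here is worth pinning down: sigmoid squashes any input to (0, 1), so its output can act as a percentage in the LSTM's gates, while tanh squashes to (-1, 1), so its output can act as a candidate memory value. A quick numerical check:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Sigmoid -> (0, 1): interpreted as a "percentage" in the LSTM's gates.
# Tanh    -> (-1, 1): interpreted as a candidate value for the memories.
for x in (-100.0, -1.0, 0.0, 1.0, 100.0):
    s, t = sigmoid(x), math.tanh(x)
    assert 0.0 <= s <= 1.0
    assert -1.0 <= t <= 1.0

print(sigmoid(0.0), math.tanh(0.0))  # prints: 0.5 0.0
```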

  • @alisaghi051
    @alisaghi051 2 months ago +1

    Thank you Josh! Watching this animation increased my respect for Jürgen Schmidhuber for thinking of and implementing this beautiful model, and for you who made it so easy to understand.

  • @Rick-Chen
    @Rick-Chen 2 years ago +3

    This is the best video to clearly explain the concept of LSTM I have ever seen!

  • @rishavmishra8110
    @rishavmishra8110 6 months ago +1

    Can't find a better explanation of LSTMs than this anywhere on YouTube. Thanks Josh!

  • @viveksundaram4420
    @viveksundaram4420 2 years ago +10

    How does this channel not have 100 million subscribers already? What beautiful content. Love the way things are presented.

  • @javadrahmannezhad9908
    @javadrahmannezhad9908 2 years ago +2

    Definitely the best visual explanation of LSTM I have ever seen!! Can't wait for your video for Transformers.

    • @statquest
      @statquest  2 years ago

      Glad you liked it!

    • @javadrahmannezhad9908
      @javadrahmannezhad9908 2 years ago +2

      @@statquest Do you have any timeline in mind for a video on Transformers?

  • @RajkumarDarbar
    @RajkumarDarbar 1 year ago +3

    I can't go to the next video without saying a big thanks to you here !! loved this explanation !! 👏

  • @markhywang
    @markhywang 1 month ago +1

    I am a current university student, and that was probably the best explanation of LSTMs that I've ever come across... you deserve my like and sub. I love the humour as well :)

  • @Bramsmelodic
    @Bramsmelodic 1 year ago +4

    As usual, the best intuitive explanation I have seen for LSTM so far! I have banged my head against this topic in thousands of articles and videos that try to explain the same block diagrams over and over. I got frustrated beyond a certain point. Thankfully Josh made this. At least concept-wise, I am clear now. What Josh does for the community is commendable.

  • @joannerizkallah176
    @joannerizkallah176 1 month ago +1

    statquest saving me again before final exams. I see statquest, I click. Thank you!

  • @yashsonune4391
    @yashsonune4391 1 year ago +3

    Can't thank you enough! The dedication you put into this video is amazing. You are my guru (or shall I call you Yoda?).

  • @sinaro93
    @sinaro93 2 years ago +3

    Just WOW! Can't wait to see the third part of this series (Transformers). Thank you, Josh.

  • @chandrachalla3466
    @chandrachalla3466 1 year ago +1

    This is an awesome video explaining LSTM. I had a little knowledge about LSTM already (I felt that is required to understand this video), but you made it really clear and eloquent. Your voice is perfect and clear. Hats off!! Thank you so much.

  • @leon9413exe
    @leon9413exe 2 years ago +3

    You use simple words to help me understand complex concepts! Really appreciate that! Looking forward to learning about Transformers from you soon!

  • @TomM-p3o
    @TomM-p3o 1 year ago

    I clicked like before watching the video, after just 5 seconds of scrolling through the visualizations.
    You have some of the best visualizations on this topic on YouTube.
    Glad I found your channel.

  • @ashfaqueazad3897
    @ashfaqueazad3897 2 years ago +209

    Your videos should begin with "universities hate this guy, learn how you increase your knowledge with Josh" 😂

    • @statquest
      @statquest  2 years ago +43

      BAM! :)

    • @computerconcepts3352
      @computerconcepts3352 2 years ago +22

      The university I'm at is hiring new professors; I wish Josh Starmer were my statistics professor, lol.

    • @Ragnarok540
      @Ragnarok540 2 years ago +15

      Universities actually love that they can use these videos as material.

    • @juanpabloaschieri6336
      @juanpabloaschieri6336 2 years ago +11

      Literally got recommended the channel by my professor, so I wouldn't say that necessarily.

    • @picassoofai4061
      @picassoofai4061 2 years ago +9

      @@Ragnarok540 Yes, they can afford to give poor classes because students will study on their own. Essentially making degrees useless.

  • @vrajmalvi7194
    @vrajmalvi7194 1 year ago +1

    Best explanation ever. I can't express how glad I am to have found this channel. 100% better than the paid course I am doing right now. Thank you :).

  • @felipeazank3134
    @felipeazank3134 1 year ago +8

    This weekend I'll try going to church to thank god for your existence Josh, seriously

  • @FullStackAmigo
    @FullStackAmigo 1 year ago +1

    This was the best video about LSTMs that I've ever seen! Thanks!

  • @DavidWalker-ko6po
    @DavidWalker-ko6po 1 year ago +3

    An exceptional explanation! I finally understand LSTMs after 6 months of trying to get my head around them! Thank you so much.

  • @brentcos9370
    @brentcos9370 1 year ago +1

    Dr. Starmer, you're a rockstar! Your videos are a life-saver. I use your videos as supplementary training as I go through other ML/DL/AI courses. The visualizations are amazing and your explanations are equally amazing. 😎👊

  • @MuddyRavine
    @MuddyRavine 2 years ago +9

    I'm taking an NLP class; we learned about LSTMs a couple of weeks ago, and I had already forgotten much. This was a very clear and well-illustrated example of how they work. Hopefully the percentage of what I now know about LSTMs that is added to my long-term memory is approaching one. Thank you! I'm waiting, with great attention, for the transformer video!!!

    • @statquest
      @statquest  2 years ago

      Awesome! I'm glad to hear the video was helpful! :)

    • @otsogileonalepelo9610
      @otsogileonalepelo9610 2 years ago +2

      Attention is all you need 😃😃

    • @statquest
      @statquest  2 years ago

      @@otsogileonalepelo9610 :)

  • @parthmangalkar
    @parthmangalkar 1 year ago +1

    This tutorial was so good!!
    Now I clearly understand how LSTMs work, and how they solve the vanishing/exploding gradient problem.
    Thank you StatQuest!!
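
The vanishing/exploding gradient problem mentioned above comes from a plain RNN reusing the same recurrent weight at every unrolled step, so the gradient picks up a factor like that weight at each step. A two-line demonstration of how fast that compounds over 50 steps:

```python
# In a plain RNN, the gradient through T unrolled steps picks up a factor of
# the same recurrent weight at every step, so it behaves roughly like w ** T.
steps = 50

for w, label in [(0.5, "vanishes"), (1.5, "explodes")]:
    factor = w ** steps
    print(f"w={w}: w**{steps} = {factor:.3g}  ({label})")

# The LSTM's long-term-memory (cell-state) path is mostly additive, which is
# why it can carry information across many steps without this compounding.
assert 0.5 ** steps < 1e-10
assert 1.5 ** steps > 1e6
```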

  • @ashishj9497
    @ashishj9497 1 month ago +5

    You are literal GOATTT

  • @Lokendrasingh-gq4xp
    @Lokendrasingh-gq4xp 5 months ago +1

    "Simplicity is the ultimate sophistication". I wonder how many hours go into making the explanation so simple and smooth.
    Great work!! Thank you!!

    • @statquest
      @statquest  5 months ago

      Thank you very much! I do spend a lot of time on these, but I enjoy it.

  • @plafle-zi7mi
    @plafle-zi7mi 11 months ago +3

    Thanks so much! This is really the best tutorial! However, I do have a tiny problem. I would really appreciate it if you could give me some help.
    At 18:00, when using the LSTM to predict company A, you said that the final short-term memory represents the predicted price on day 5. So, when you input the price on day 3 (which is 0.25), the short-term memory (which is -0.2) should represent the predicted price on day 4. However, the real price on day 4 is 1. There seems to be a problem.

    • @statquest
      @statquest  11 months ago +1

      In order to keep this example as simple as possible (so I could illustrate it), this model was only trained to predict the value on day 5. It wasn't trained to predict the value on day 3 or any other day.

    • @plafle-zi7mi
      @plafle-zi7mi 11 months ago

      @@statquest Got it. Thanks for the swift and helpful reply! I'm truly grateful for your help and your tutorial!

    • @ajaygupta6034
      @ajaygupta6034 1 month ago

      @@statquest I think you trained the model completely, but maybe there was too little data for the LSTM to get an accurate prediction for day 3? Anyway, great explanation!!

    • @statquest
      @statquest  1 month ago

      @@ajaygupta6034 The model was never trained to get an accurate prediction for day 3. It was only trained to make predictions on day 5.

    • @ajaygupta6034
      @ajaygupta6034 1 month ago

      @@statquest Okay, thanks for clearing that up! What would we change to train the model to get an accurate prediction on day 3?
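
The question in this thread is easy to poke at with a scalar LSTM unrolled over the four input days. All weights and biases below are invented for illustration (they are not the trained values from the video); the point is that only the final short-term memory is trained to mean anything (the day-5 prediction), so the intermediate short-term memories need not track the actual prices.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, long_mem, short_mem, p):
    """One LSTM unit with scalar (1-dimensional) memories.
    p holds made-up weights: (w_f, u_f, b_f, w_i, u_i, b_i,
    w_c, u_c, b_c, w_o, u_o, b_o)."""
    w_f, u_f, b_f, w_i, u_i, b_i, w_c, u_c, b_c, w_o, u_o, b_o = p
    forget = sigmoid(w_f * x + u_f * short_mem + b_f)    # % of long-term memory kept
    inp    = sigmoid(w_i * x + u_i * short_mem + b_i)    # % of candidate added
    cand   = math.tanh(w_c * x + u_c * short_mem + b_c)  # candidate long-term memory
    long_mem = forget * long_mem + inp * cand
    out    = sigmoid(w_o * x + u_o * short_mem + b_o)    # % of memory to output
    short_mem = out * math.tanh(long_mem)                # new short-term memory
    return long_mem, short_mem

params = (2.7, 1.6, 1.6, 2.0, 1.6, 0.6, 1.3, 0.9, 0.3, 4.4, 2.6, -0.3)  # invented
long_mem = short_mem = 0.0
for day, price in enumerate([0.0, 0.5, 0.25, 1.0], start=1):
    long_mem, short_mem = lstm_step(price, long_mem, short_mem, params)
    print(f"after day {day}: short-term memory = {short_mem:.3f}")
# Only the final short-term memory (after day 4's input) is trained to be the
# day-5 prediction; the earlier short-term memories are just whatever the
# unrolled network computes along the way.
```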

  • @hivusim
    @hivusim 11 months ago +1

    😍Thanks!

    • @statquest
      @statquest  11 months ago

      TRIPLE BAM!!! Thank you so much for supporting StatQuest! :)

  • @shoto6018
    @shoto6018 2 years ago +4

    BAAAM, First. Gotta Thank Professor Josh before I even watch the video

  • @NJCLM
    @NJCLM 1 year ago +2

    Thank you so much! I became a paid member. I wish you great success in your passion for teaching.

  • @ttominable
    @ttominable 2 years ago +3

    I'm a native Spanish speaker, and when this video started playing in Spanish my face was one of genuine horror, mainly because I'm used to Josh's voice. I'm glad I could switch it back.

    • @statquest
      @statquest  2 years ago +1

      Ha! That's funny. Well, to be honest, one day my dream is to record my own Spanish overdubs. I'm still very far away from that dream coming true, but maybe one day it will happen.

  • @adityarajora7219
    @adityarajora7219 1 year ago +2

    I have no clue why on earth such content is FREE!

  • @sidverma1888
    @sidverma1888 2 years ago +4

    Thank you professor, you are the best!

  • @MCMelonslice
    @MCMelonslice 2 years ago +2

    Gosh, Josh. You make learning such a breeze. Thank you very much for every single BAM!

  • @jinchengliu9586
    @jinchengliu9586 2 years ago +3

    Amazingly explained! Can't wait to watch the Transformers video!

  • @shadowlynx2624
    @shadowlynx2624 6 months ago +1

    THIS IS THE BEST VIDEO TO UNDERSTAND LSTM!! KEEP UP THE GREAT WORK!!!

    • @statquest
      @statquest  6 months ago

      Thanks, will do!

  • @saikrishna-ie1xf
    @saikrishna-ie1xf 1 year ago +3

    Great series of videos; please make some for "transformers" too!! Thanks in advance.

  • @marisa4942
    @marisa4942 1 year ago

    Thank you so much for continuing to upload videos on this machine learning topic!! Your videos saved my grade a year ago, and now they have helped my team members understand the concept very easily!

  • @clemmensenwilliamson6520
    @clemmensenwilliamson6520 9 months ago +1

    Man!.. This video is awesome! All of your videos are awesome! You are awesome!!!

    • @statquest
      @statquest  9 months ago

      Glad you like them!

  • @manonarasimha3913
    @manonarasimha3913 8 months ago +1

    You explain the concepts extremely well and in a simple manner! Thank you very much!

  • @high_fly_bird
    @high_fly_bird 1 year ago +1

    omg i am so fond of these videos! Thank you so much for doing it!

  • @Murattheoz
    @Murattheoz 1 year ago +1

    Your videos always put a smile on my face while I'm learning.

  • @PremKumar-ym3vh
    @PremKumar-ym3vh 1 year ago +1

    This is what teaching should be. I have tried watching a bunch of videos on YouTube, and almost all of them were technical jargon. I didn't understand the why part!
    Thank you, Dr. Starmer, for making such videos.

  • @beulahnarendrapurapu2029
    @beulahnarendrapurapu2029 1 year ago +1

    You are the bestest teacher I've ever seen. I am a teacher myself, and currently also a student taking an AI course.

  • @viratzz
    @viratzz 10 months ago +1

    Truly a magical way of explaining such complex topics!

  • @animalside999
    @animalside999 5 months ago +1

    Amazing. The best LSTM explanation possible. Period.

  • @vishnuprasadj6511
    @vishnuprasadj6511 2 years ago +1

    Finally, I understood LSTM. Clean and simple explanation. Probably the best one about LSTM.

  • @emircagr3154
    @emircagr3154 1 year ago +1

    I am commenting on a video after a really, really long time. For me, that is the best indication of how useful this video is. Thanks :)

  • @sohamborkar2117
    @sohamborkar2117 9 months ago +1

    This was the easiest explanation of LSTM ever. Thank you so much...

  • @Jjjj24971
    @Jjjj24971 2 years ago +2

    I have watched almost all of your videos from the beginning... I found that your teaching and visualization skills get better and better with every single video. This is the best quality of a data scientist, one which I do not find in many data scientists.

  • @anusha6033
    @anusha6033 2 months ago +1

    I must say I have never understood a concept so well. Thank you so much, Dr. Starmer. TRIPLE BAM!! Will come back here to learn about Transformers next.

  • @VORBILDER
    @VORBILDER 1 year ago +1

    Brilliant explanation, Josh!

  • @kc66
    @kc66 2 years ago +2

    Hands down the best explanation for LSTM!

  • @pb9405
    @pb9405 2 years ago +1

    Your videos are very accessible, I love them. I'll definitely recommend them when I'm asked for introductions to Machine Learning content.

    • @statquest
      @statquest  2 years ago

      Awesome! Thank you! :)

  • @parisahormozzade-r6o
    @parisahormozzade-r6o 11 months ago +1

    Wow! These videos are absolutely incredible! What a presentation!

  • @scihistoryfusion
    @scihistoryfusion 2 years ago +1

    So far the best explanation of LSTM. Thanks a lot for this video. Eagerly waiting for the next stage, 'Transformer'.

  • @sagartalagatti1594
    @sagartalagatti1594 5 months ago +1

    This was a GOD level explanation of LSTM!!! Hats off!!!

  • @yousufahmed985
    @yousufahmed985 10 months ago +1

    HOLY SMOKES the concept is now crystal clear 🔥🔥🔥🔥🔥

  • @ouedraogoamisamyra2799
    @ouedraogoamisamyra2799 1 year ago +1

    Clear and straightforward explanation. So far, this is the best explanation I have seen. Is there a video clearly explaining GRU (Gated Recurrent Unit) concepts?

    • @statquest
      @statquest  1 year ago

      Thanks! I don't have a video on GRU yet.

  • @mateuszsmendowski2677
    @mateuszsmendowski2677 1 year ago +2

    The best explanation of how an LSTM cell works.

  • @SarveshRansubhe
    @SarveshRansubhe 2 years ago +2

    You are insane!!!!!!!!!!!!!!!!!!!!!!!!!
    I never thought I would understand LSTM by looking at a diagram, but you made it happen.

  • @leejo5160
    @leejo5160 1 year ago +1

    The best explanation on LSTM ever! Thank you so much!

    • @statquest
      @statquest  1 year ago

      Glad it was helpful!

    • @leejo5160
      @leejo5160 1 year ago

      @@statquest Thank you for your reply, Josh. One thing I am a little confused about is the difference between short-term memory and prediction. At around 18:00, when you explain the day-5 prediction, you say that the final short-term memory is the day-5 prediction. Does that mean the input value is the actual price on a certain date and the short-term memory is the price prediction for a certain date? If that's the case, then the short-term memories should be very close to the input values, but they are not (0, -0.1, -0.1, -0.2 vs 0, 0.25, 0.5, 1).

    • @statquest
      @statquest  Год назад

      @@leejo5160 The model was only trained to predict the output on day 5. And, as such, only makes good predictions for day 5. However, we could train it to predict every day if we wanted to. We'd probably need more data or a more complicated model (more layers or a fully connected network at the end).

  • @MishoQ
    @MishoQ 8 months ago +1

    Great video! It is super easy to watch and understand!
    Also, it would be really helpful if you made a video where you clearly explain backpropagation in LSTMs, because there are almost no reliable and understandable videos on this topic on YouTube...
    Thank you!
    Edit: just saw your pinned comment with all the stuff about backpropagation in LSTM, so thanks again :)

    • @statquest
      @statquest  8 месяцев назад

      bam! :)

    • @MishoQ
      @MishoQ 8 месяцев назад

      @@statquest By the way, I wanted to ask you something regarding backpropagation in LSTM. At first, I wanted to do the math to calculate each weight's/bias's derivative (like you said in the pinned comment), but soon after, I found out that it would take a long time, since the formulas get more complicated the deeper you go. So, I decided to calculate the derivative of the error of the output of the last LSTM (its STM) with respect to the output of the previous one, so that I get the "gradient" of the previous LSTM's STM, and then calculate all the local derivatives for all weights and biases (and repeat this algorithm for each time step). Thus, using those gradients I can just locally calculate all the derivatives without any problem for each previous LSTM block and then finally add them together and adjust the weights and biases. Is it correct to do so?

    • @statquest
      @statquest  8 месяцев назад

      @@MishoQ To be honest, I'm having a hard time imagining exactly how your process works. A much easier approach would to just do a proof of concept gradient using a very simple, vanilla RNN like this: ruclips.net/video/AsNTP8Kwu80/видео.html In that case, it's much easier to calculate everything and you can validate that your method is correct.

    • @MishoQ
      @MishoQ 8 месяцев назад

      @@statquest Well, yes, you are right, it would be great to try to prove it with some smaller models. But it seems like this method for LSTM is not really applicable to a vanilla RNN, since while LSTM blocks are fully used in the calculation for each time step (each weight and bias is taken into account), in a vanilla RNN we skip some of them (like w3 and b2 in the RNN from your example in the video you mentioned) and use them only in the last iteration. However, I think there is a method, and the idea is worth a try. I guess I will start with the RNN then, and after that come back to the LSTM (hopefully with new ideas and a better understanding). :)
      In general, thank you for your responses. In my opinion, the fact that you respond to each comment and help people with their questions is super cool. You really deserve huge respect for that. Keep doing your best! :)
      Edit: I decided to write the LSTM from scratch (in C++) with this method of backpropagation to see how it works, and after some hyperparameter tuning and training it performed quite well (92% precision) on a simple task (similar to the task in your video). Although I did not manage to find any detailed and accessible sources that explain backpropagation in LSTM, I guess that this method is likely correct, given that the network actually showed the result. :))

  • @hey-its-me239
    @hey-its-me239 1 year ago +1

    Thank you so much for your videos! I never knew my brain could handle such COMPLICATED CONCEPTS!
    TRIPLE BAM!!!!!!!!!
    🥰😍🤩

  • @er.shashikantkumar9584
    @er.shashikantkumar9584 2 years ago +1

    Sir, this is my most eagerly awaited topic. A lot of thanks. I know that after watching this LSTM video my doubts will be clear.

  • @drramasubramaniam6724
    @drramasubramaniam6724 1 year ago +2

    Excellent introduction to LSTMs. Thank you Josh

  • @SelfBuiltWealth
    @SelfBuiltWealth 5 months ago +1

    Hello Mr. Josh, I hope you are reading this. I noticed that year by year, as time progresses, the amount there is to learn rises exponentially. For example, if we go back to 1993, there was no such thing as RNNs and machine learning was such a small field; in just two decades we found these complex algorithms. Our brains need to keep up: down the line, our great-grandchildren will have to learn even more just to arrive at "OK, now I know everything that has been discovered in this field". So I'm worried about the future: humans will have to learn so much more than us. That's why I wrote this comment, to thank you from the bottom of my heart and tell you that your work is THE MOST essential to human progress. With you, Mr. Josh, in my opinion, the way we learn deep learning and ML has been revolutionized. We need more people like you to evolve our ways of learning. Thank you so much and TRIPLE BAM :)❤❤❤❤❤

    • @statquest
      @statquest  5 months ago

      Wow! Thank you very much! I'm really glad you like my videos. :)

  • @i9erek
    @i9erek 2 months ago +1

    That is the best explanation I've seen anywhere!

  • @shahriyarharis6752
    @shahriyarharis6752 11 months ago +1

    I seldom comment on videos, but credit lies where it's due. Hands down the best video on LSTM I have watched

  • @aleksszukovskis2074
    @aleksszukovskis2074 1 year ago +2

    Congrats on 1M subscribers!

  • @sanasaleh2750
    @sanasaleh2750 9 months ago +1

    Really grateful for your dedication. Perfect video, as always. Yet I missed the last part, in which you review the whole concept.

    • @statquest
      @statquest  9 months ago

      I miss those parts too. But very few people watched them. :(

    • @sanasaleh2750
      @sanasaleh2750 9 months ago +1

      @@statquest 😥.. Those 2 minutes are really helpful for grasping the essence of the complete video. Btw, I am glad I found your channel; it helps me understand the real stuff without getting confused by complex mathematical notation.