LSTM explained simply | LSTM explained | LSTM explained with example.

  • Published: 7 Feb 2025
  • LSTM explained simply | LSTM explained | LSTM explained with an example
    #lstm #machinelearning #deeplearning #ai
    Hello,
    My name is Aman and I am a Data Scientist.
    All amazing data science courses at the most affordable price here: www.unfolddata...
    Book a one-on-one session here (note: these support sessions are chargeable): docs.google.co...
    Follow on Instagram: unfold_data_science
    Topics for this video:
    LSTM explained simply,
    LSTM explained,
    LSTM explanation,
    LSTM explained with example,
    LSTM explained medium,
    lstm for stock prediction,
    lstm time series prediction,
    lstm pytorch,
    lstm tensorflow,
    lstm vs transformer,
    lstm neural network tensorflow,
    lstm model for time series prediction,
    long short term memory,
    lstm for anomaly detection,
    lstm for chatbot,
    unfold data science deep learning,
    unfold data science neural network,
    About Unfold Data science: This channel is to help people understand the basics of data science through simple examples in an easy way. Anybody without prior knowledge of computer programming or statistics or machine learning and artificial intelligence can get an understanding of data science at a high level through this channel. The videos uploaded will not be very technical in nature and hence can be easily grasped by viewers from different backgrounds as well.
    Book recommendation for Data Science:
    Category 1 - Must Read For Every Data Scientist:
    The Elements of Statistical Learning by Trevor Hastie - amzn.to/37wMo9H
    Python Data Science Handbook - amzn.to/31UCScm
    Business Statistics By Ken Black - amzn.to/2LObAA5
    Hands-On Machine Learning with Scikit Learn, Keras, and TensorFlow by Aurelien Geron - amzn.to/3gV8sO9
    Category 2 - Overall Data Science:
    The Art of Data Science By Roger D. Peng - amzn.to/2KD75aD
    Predictive Analytics by Eric Siegel - amzn.to/3nsQftV
    Data Science for Business By Foster Provost - amzn.to/3ajN8QZ
    Category 3 - Statistics and Mathematics:
    Naked Statistics By Charles Wheelan - amzn.to/3gXLdmp
    Practical Statistics for Data Scientists by Peter Bruce - amzn.to/37wL9Y5
    Category 4 - Machine Learning:
    Introduction to machine learning by Andreas C Muller - amzn.to/3oZ3X7T
    The Hundred Page Machine Learning Book by Andriy Burkov - amzn.to/3pdqCxJ
    Category 5 - Programming:
    The Pragmatic Programmer by David Thomas - amzn.to/2WqWXVj
    Clean Code by Robert C. Martin - amzn.to/3oYOdlt
    My Studio Setup:
    My Camera: amzn.to/3mwXI9I
    My Mic: amzn.to/34phfD0
    My Tripod: amzn.to/3r4HeJA
    My Ring Light: amzn.to/3gZz00F
    Join the Facebook group:
    www.facebook.c...
    Follow on medium: / amanrai77
    Follow on quora: www.quora.com/...
    Follow on Twitter: @unfoldds
    Watch the Introduction to Data Science full playlist here: • Data Science In 15 Min...
    Watch python for data science playlist here:
    • Python Basics For Data...
    Watch the statistics and mathematics playlist here:
    • Measures of Central Te...
    Watch End to End Implementation of a simple machine-learning model in Python here:
    • How Does Machine Learn...
    Learn Ensemble Model, Bagging, and Boosting here:
    • Introduction to Ensemb...
    Build Career in Data Science Playlist:
    • Channel updates - Unfo...
    Artificial Neural Network and Deep Learning Playlist:
    • Intuition behind neura...
    Natural language Processing playlist:
    • Natural Language Proce...
    Understanding and building a recommendation system:
    • Recommendation System ...
    Access all my codes here:
    drive.google.c...
    Have a different question for me? Ask me here: docs.google.co...
    My Music: www.bensound.c...

Comments • 124

  • @rajendrabhise4427
    @rajendrabhise4427 9 months ago +4

    This is the first video I encountered on the LSTM subject, and I don't think I need to watch any other videos to understand LSTM anymore. What a clear and straight-to-the-point lecture.

  • @deepakdodeja4663
    @deepakdodeja4663 1 year ago +6

    This is a well-researched, well-designed, well-explained video I would say. Thanks a lot for efficiently explaining LSTM in just 30 minutes.

  • @msbrdmr
    @msbrdmr 1 year ago +3

    This was the most understandable tutorial I've ever seen about LSTM. Keep going, you are awesome.

  • @shanjidabduhalim6146
    @shanjidabduhalim6146 1 year ago +3

    What a clear and straight-to-the-point lecture. Thanks, Prof!

  • @vetrivelmurugan4937
    @vetrivelmurugan4937 2 months ago +1

    Super, excellent! I got a very clear idea about LSTM 🎉😊

  • @youssefbechara8809
    @youssefbechara8809 5 months ago

    I have to say, Mr. Aman, you've become my favourite deep learning teacher. Your ability to teach hard and complex maths through simple real-life examples is amazing! Other YouTubers teach us like robots, throwing around difficult words, but you really make it feel like a conversation!! It's really apparent how professional and passionate you are about your work. Thank you so much ❤!

  • @Mimimoon2024
    @Mimimoon2024 1 year ago +1

    Thank you, Prof. Now I am confident enough about what is happening in LSTM cells. Even the math behind it is very clear and well explained. Thank you for this course.

  • @nagarajtrivedi610
    @nagarajtrivedi610 4 months ago

    Very well explained, Aman. All these days I was not clear about how it retains information over short and long durations. I was also wondering how an LSTM predicts new words in a sequence. Today it has become clear to me. Thank you again.

  • @dheerajp3024
    @dheerajp3024 5 months ago +1

    For the question asked at 18:38:
    The range of the sigmoid function is [0, 1] and the range of the tanh function is [-1, 1].
    During backpropagation, the derivative of sigmoid (at most 0.25) is much smaller than the derivative of tanh (at most 1). Over long sequences, the product of many small sigmoid derivatives shrinks towards zero and causes the vanishing-gradient problem, whereas tanh derivatives can stay close to one, which mitigates it. Keep in mind that an LSTM can still suffer from exploding gradients, so techniques like gradient clipping and batch or layer normalization are used.
    I hope this answers the question.
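    A small numeric sketch of the two claims above (an illustration added here, not from the video, assuming NumPy and PyTorch are installed; the tiny nn.LSTM is just a hypothetical stand-in model). It shows that the sigmoid derivative peaks at 0.25 while the tanh derivative peaks at 1, why repeatedly multiplying small derivatives makes gradients vanish, and how gradient clipping is applied in one line.

      import numpy as np
      import torch
      import torch.nn as nn

      x = np.linspace(-5, 5, 1001)
      sigmoid = 1 / (1 + np.exp(-x))
      d_sigmoid = sigmoid * (1 - sigmoid)      # derivative of sigmoid, peaks at 0.25
      d_tanh = 1 - np.tanh(x) ** 2             # derivative of tanh, peaks at 1.0
      print(d_sigmoid.max(), d_tanh.max())     # ~0.25 vs ~1.0

      # Multiplying many small derivatives (one per time step) shrinks the gradient.
      print(0.25 ** 20, 1.0 ** 20)             # ~9e-13 (vanishes) vs 1.0 (preserved)

      # Hypothetical LSTM training step with gradient clipping against exploding gradients.
      model = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
      out, _ = model(torch.randn(4, 10, 8))    # batch of 4 sequences, 10 steps, 8 features
      loss = out.mean()
      loss.backward()
      torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)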

  • @ShivaniChauhan-g8t
    @ShivaniChauhan-g8t 4 months ago

    Thank you, sir, for this explanation of LSTM, made easy and understandable in a few minutes.

  • @renus4898
    @renus4898 7 months ago

    Hi Aman, no words to say... simply superb! Excellent topic selection, explanation, and presentation. Please continue your journey; it is so helpful.

  • @omarabubakarosman2791
    @omarabubakarosman2791 1 year ago +1

    A very clear presentation. We are very grateful for breaking down this complex deep learning concept.

  • @SelfBuiltWealth
    @SelfBuiltWealth 5 months ago +1

    No bullshit, this is the best explanation on YouTube ❤❤ Please keep helping us.

  • @Okive-green
    @Okive-green 1 year ago +1

    Thank you, sir, for this brief but detailed video. It really helped me get the idea of LSTM.

  • @barwalgayatri4655
    @barwalgayatri4655 1 month ago

    Too good, Aman, very easy to understand.

  • @vimalashekar-c8c
    @vimalashekar-c8c 4 months ago

    I like your style of teaching. Great job!

  • @houssam0017
    @houssam0017 2 months ago

    Thanks a lot. Well explained.

  • @samadhanpawar6554
    @samadhanpawar6554 1 year ago +4

    Amazing explanation, very clear and easy to understand; keep up the good work.
    Please also make a video on GRU, Transformers, the attention mechanism, and encoder-decoder models 😊

  • @Tatanajafi
    @Tatanajafi 10 months ago

    Best description. Thank you.

  • @samiashah7914
    @samiashah7914 1 year ago

    Fantastic, very well explained. ❤ Thank you, sir.

  • @vimu-frm-slm
    @vimu-frm-slm 7 months ago +1

    Hi Aman, thanks for your video. I understand the vanishing gradient problem, where small gradients are backpropagated to update the weights: if the gradient is small, the update to the weights will be even smaller, which slows learning and leads to poor performance.
    I also understand the LSTM model, which has long-term memory, short-term memory, a forget gate, an input gate, and an output gate. What I don't understand is how the LSTM fixes the vanishing gradient problem during backpropagation. The gradient can still be small even when using an LSTM, and when it is backpropagated, the weight updates will still be affected.
    I understand how the LSTM is used and how it helps in forward propagation; how does it help in backpropagation? Please make a video explaining that. Your help is much appreciated. Thanks again.
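    A note on the question above, using the standard LSTM formulation rather than anything specific to this video: the cell state is updated additively, and the gradient that flows backwards along the cell-state path is scaled by the forget gate rather than by a repeated weight-matrix multiplication.

      c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t
      \quad\Longrightarrow\quad
      \partial c_t / \partial c_{t-1} = \mathrm{diag}(f_t)

    When the forget gate stays close to 1 for the dimensions worth remembering, this factor stays close to 1 over many time steps, so the error signal along the cell state survives backpropagation through time far better than in a plain RNN, where it is repeatedly multiplied by the recurrent weight matrix and an activation derivative.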

  • @agbershimaemmanuel-ci6mz
    @agbershimaemmanuel-ci6mz 2 months ago

    Thank you, Aman, for this interesting lecture.

  • @nimeshraijada5844
    @nimeshraijada5844 1 year ago +2

    You have excellent teaching skills 👍

  • @revanthisbnimmagadda
    @revanthisbnimmagadda 5 months ago

    Well explained, Aman; I like the video on LSTM.

  • @swapnil2881
    @swapnil2881 1 year ago

    Excellent. I watched so many videos but the concept was never clear; today's video was very helpful.

  • @TheMato1112
    @TheMato1112 1 year ago

    Thank you very much. Finally, I understand. Well done.

  • @rosie-5h
    @rosie-5h 1 year ago

    Wow, such a hard topic still felt really simple. Thank you, sir, for explaining it so nicely.

  • @slickjunt
    @slickjunt 5 months ago

    Keep up the great work!

  • @raheemasghar2383
    @raheemasghar2383 1 year ago

    Great effort, Aman.

  • @sunilkumarsam150
    @sunilkumarsam150 1 year ago +2

    Best explanation ever!👍

  • @PriyaBeram13
    @PriyaBeram13 7 months ago

    Great explanation, thank you, sir! Could you tell me the use of tanh in the input gate?

  • @prateekbhadauria7004
    @prateekbhadauria7004 1 year ago

    A nice, brief, and superb explanation; thank you for spreading your knowledge.

  • @shine_through_darkness
    @shine_through_darkness 9 months ago

    Thank you, brother, your channel is really good.

  • @NileshPatil-pl2fj
    @NileshPatil-pl2fj 6 months ago

    I liked the video... very nice explanation.

  • @veenajain
    @veenajain 1 year ago

    Awesome content, Aman.

  • @vincentdey4313
    @vincentdey4313 8 months ago

    Well explained. Great job.

  • @princekhunt1
    @princekhunt1 7 months ago

    Nice explanation 👍

  • @zenithmacwan
    @zenithmacwan 1 year ago +1

    Excellent explanation! Thank you, sir!

  • @TheSerbes
    @TheSerbes 4 months ago

    I want to do parameter selection for an LSTM and remove unnecessary parameters. Do you have a video on how I can do this?

  • @divyaharshad9985
    @divyaharshad9985 11 months ago

    Very good content! Explained lucidly.

  • @Gezahegnt2000
    @Gezahegnt2000 8 months ago +1

    Nice tutorial, thanks.

  • @Hunger_Minds
    @Hunger_Minds 11 months ago

    Aman sir, I have a query regarding the LSTM architecture:
    in how many iterations will the model learn which word is important and which one is not?

  • @RafaelRivetti
    @RafaelRivetti 6 months ago

    In an MLP network, data from the independent variables at time t are used to predict a future value at t+n. In an LSTM network, instead of using only data from time t of the independent variables, it uses data from times t, t-1, t-2, ..., t-n (as chosen by the programmer), and from that generates the prediction for a future time t+n. Is this reasoning correct? Thank you very much!
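    Since the question above is really about how the input data is shaped, here is a minimal sketch of the windowing it describes (an illustration added here, assuming NumPy; make_windows and n_lags are just illustrative names): an MLP-style input holds only the feature values at a single time step, while an LSTM-style input stacks the most recent n_lags values into one sequence per sample.

      import numpy as np

      def make_windows(series, n_lags):
          """Build (samples, n_lags, features) sequences and their next-step targets."""
          X, y = [], []
          for t in range(n_lags, len(series)):
              X.append(series[t - n_lags:t])   # the n_lags values before time t
              y.append(series[t])              # the value to predict at time t
          return np.array(X), np.array(y)

      series = np.sin(np.linspace(0, 20, 200)).reshape(-1, 1)   # toy univariate series
      X_lstm, y = make_windows(series, n_lags=5)
      X_mlp = series[4:-1]                     # MLP: only the single latest value per target

      print(X_lstm.shape)   # (195, 5, 1) -> a sequence of the last 5 time steps per sample
      print(X_mlp.shape)    # (195, 1)    -> one time step per sample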

  • @shubhrashrivastava9510
    @shubhrashrivastava9510 9 months ago

    What is the difference between the final output o_t and h_t? Please explain in detail; thanks in advance.

  • @preethibaligar6766
    @preethibaligar6766 7 months ago

    And just like that, the magic of understanding happened!

  • @spoc.mnmjecspringboardmnmjec
    @spoc.mnmjecspringboardmnmjec 3 months ago

    Good sir

  • @danielfiadjoe9312
    @danielfiadjoe9312 1 year ago

    Thank you. This is very clear.

  • @sunilkumarsam150
    @sunilkumarsam150 1 year ago +3

    Sir, can you make a video on implementing and writing research papers effectively in the fields of NLP, ML, and DL?

  • @Hiisbnxkskkwl
    @Hiisbnxkskkwl 1 month ago +1

    Sir, excellent, you made us master it in no time 😂

  • @adityarajiv6346
    @adityarajiv6346 10 months ago

    Thanks, it was really helpful!

  • @KhairulMia-tr2jv
    @KhairulMia-tr2jv 5 months ago

    Good tutorial on LSTM, which gave me a good idea of it.

  • @aiddee-p2n
    @aiddee-p2n 1 year ago

    Simply amazing

  • @kavyanagesh8304
    @kavyanagesh8304 1 year ago

    You're THE BEST! Thank you.

  • @prateekkumar2740
    @prateekkumar2740 1 year ago +2

    Great content, Aman. Could you please cover solving a time-series problem using LSTM? Thanks.

  • @MayankArya-jp7dq
    @MayankArya-jp7dq 7 months ago

    Thanks Sir :)

  • @tehreemqasim2204
    @tehreemqasim2204 9 months ago

    Excellent tutorial, thank you.

  • @red_righthand-o4x
    @red_righthand-o4x 9 months ago

    Great, brother.

  • @aliyuabubakarmusa946
    @aliyuabubakarmusa946 8 months ago

    The video is very interesting

  • @BhanusriAndhavarapu-xc1xb
    @BhanusriAndhavarapu-xc1xb 9 months ago

    Could you please explain time-series data?

  • @selvimurali-o2s
    @selvimurali-o2s 1 year ago

    Nice explanation. Thank you.

  • @casuresh7081
    @casuresh7081 1 year ago

    Good Explanation. Thanks

  • @ashwinijalla3504
    @ashwinijalla3504 1 year ago

    Please do a complete series on Generative AI concepts.

  • @lathusree
    @lathusree 1 year ago

    Hi Aman, can you help explain bus arrival time prediction using LSTM?

  • @malaykhare1006
    @malaykhare1006 4 months ago

    amazing

  • @AIandSunil
    @AIandSunil 1 year ago

    Excellent explanation, sir ❤ Thank you.

  • @akshitacharak4385
    @akshitacharak4385 1 year ago

    Thank you so much 👍🏼

  • @Amin-ue2dz
    @Amin-ue2dz 8 months ago

    You are amazing.

  • @IT-pz8yz
    @IT-pz8yz 1 year ago

    Great explanation, thank you.

  • @dhirajpatil6776
    @dhirajpatil6776 9 months ago

    Sir, please can you provide written notes for this video, i.e., everything you have explained in the video written out in words? Thank you.

  • @milindankur
    @milindankur 1 year ago

    Great explanation! Thank you!

  • @muhammedthayyib9202
    @muhammedthayyib9202 1 year ago

    GREAT work

  • @sg28011
    @sg28011 11 months ago

    How will the forget gate know that a particular word is irrelevant or of less importance?

    • @sg28011
      @sg28011 11 months ago

      Got the answer as the video progressed

  • @parsasakhi1505
    @parsasakhi1505 1 year ago

    Awesome! Thank you.

  • @afn8370
    @afn8370 1 year ago

    Thank you, man.

  • @rahulraizada7001
    @rahulraizada7001 1 year ago

    What is the difference between tanh and ReLU?

  • @chirumadderla8129
    @chirumadderla8129 1 year ago

    Can you please consider the use case of weather forecasting?

  • @jawaherabdulwahabfadhil2329
    @jawaherabdulwahabfadhil2329 1 year ago

    Thank you, sir.

  • @rahulraizada7001
    @rahulraizada7001 1 year ago +1

    Sigmoid lies between 0 and 1, whereas tanh lies between -1 and 1.

  • @spoc.mnmjecspringboardmnmjec
    @spoc.mnmjecspringboardmnmjec 1 year ago

    Nice sir

  • @tishachhabra3369
    @tishachhabra3369 1 year ago

    Best👍

  • @chirumadderla8129
    @chirumadderla8129 1 year ago

    Super stuff

  • @bakyt_yrysov
    @bakyt_yrysov 4 months ago

    🔥🔥🔥

  • @mohitchouksey6707
    @mohitchouksey6707 1 year ago

    Sir, how many people got jobs from your course?

  • @hassanarshad2687
    @hassanarshad2687 1 year ago

    🤯🤯🤯🤯🤯🤯

  • @Mimimoon2024
    @Mimimoon2024 1 year ago

    You know, Prof, I bought a bunch of courses related to this; believe me, they were not as clear as your course.

  • @KumR
    @KumR 1 year ago

    Why sigmoid and why tanh?

  • @allinteli
    @allinteli 10 months ago

    Sir, please make videos in Hindi.

  • @siddheshmhatre2811
    @siddheshmhatre2811 1 year ago

    Finally found the god of teaching.

  • @_Channel_X
    @_Channel_X 11 months ago

    This is what a "simple explanation" means in the real sense.