What is LSTM (Long Short Term Memory)?

  • Published: 16 May 2024
  • Learn about watsonx → ibm.biz/BdvxRB
    Long Short-Term Memory networks, also known as LSTMs, are a special kind of Recurrent Neural Network (RNN) architecture capable of learning long-term dependencies, and they also address the vanishing gradient problem that can occur when training traditional RNNs.
    In this lightboard video, Martin Keen with IBM breaks down why we need LSTMs to address the problem of long-term dependencies, how the cell state and its various gates help transfer relevant information along a sequence chain, and a few key LSTM use cases.
    #LSTM #RNN #AI
  • Science
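The description above names the key pieces of an LSTM cell: the cell state and the gates that decide what to forget, store, and output. As a rough sketch (not from the video — the variable names, toy dimensions, and random weights are illustrative assumptions), here is one LSTM time step in plain NumPy:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step: gates decide what to forget, store, and output."""
    z = W @ np.concatenate([h_prev, x]) + b   # joint pre-activation for all four gates
    H = h_prev.size
    f = sigmoid(z[0:H])          # forget gate: what to drop from the cell state
    i = sigmoid(z[H:2 * H])      # input gate: what new information to store
    g = np.tanh(z[2 * H:3 * H])  # candidate values for the cell state
    o = sigmoid(z[3 * H:4 * H])  # output gate: what to expose as hidden state
    c = f * c_prev + i * g       # updated cell state (the long-term memory)
    h = o * np.tanh(c)           # updated hidden state (the short-term output)
    return h, c

# Toy usage: 3-dim inputs, 4-dim hidden state, small random weights.
rng = np.random.default_rng(0)
H, X = 4, 3
W = rng.standard_normal((4 * H, H + X)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):               # run a short input sequence
    h, c = lstm_step(rng.standard_normal(X), h, c, W, b)
print(h.shape, c.shape)          # (4,) (4,)
```

Because the forget and input gates are sigmoids, the cell state update is additive rather than repeatedly multiplied through the recurrence, which is what helps gradients survive over long sequences.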

Comments • 81

  • @simonbax2002
    @simonbax2002 1 year ago +28

    Martin you are a wonderful teacher! Thank you very much for the explanations.

  • @toenytv7946
    @toenytv7946 2 years ago +5

    Great advancement in time. Glad to have a better understanding. Thank you folks

  • @DieLazergurken
    @DieLazergurken 2 years ago +37

    Very helpful lecture. Keep up the good work!

  • @engr.inigoe.silvagalvan1161
    @engr.inigoe.silvagalvan1161 1 year ago +7

    Very useful and helpful. This is a hard topic to grasp readily, but you did it in just 8 minutes. Thanks for that, Mr. Martin and company. Greetings from Mexico!

  • @amalkumar256
    @amalkumar256 5 months ago +3

    I'm in love with his way of teaching!

  • @fouziafathima6460
    @fouziafathima6460 9 months ago +1

    Clear and concise explanation.
    👍

  • @IsxaaqAcademy
    @IsxaaqAcademy 8 days ago +1

    After watching Martin Keen's explanations, you don't need another short explanation

  • @channel-xj7rp
    @channel-xj7rp 2 years ago +78

    So are we going to ignore the fact that he wrote everything backwards on a clear glass wall?

    • @IBMTechnology
      @IBMTechnology  2 years ago +103

      That's likely the #1 comment! The presenter writes on the glass and we flip the image in post-production. Martin is actually right-handed. Some married presenters will switch their wedding ring to the other hand or not wear it so as not to confuse their spouse.

    • @gehallak660
      @gehallak660 1 year ago +6

      @@IBMTechnology Smart! I was wondering how you did this.

    • @abhishekrao6198
      @abhishekrao6198 1 year ago +5

      @@IBMTechnology I don't understand how you made it look like he's writing backwards on glass — could you explain it to me?

    • @chaoma8228
      @chaoma8228 1 year ago +4

      @@abhishekrao6198 The presenter writes with his right hand. The camera is in front of him, so in the original footage all the sentences are reversed. Flip the video left to right, and you get the video you're watching now.

    • @nitinkapoor4752
      @nitinkapoor4752 1 year ago +5

      @@IBMTechnology 😂😂…the most obvious observation. People who didn’t even figure that out 🤦‍♂️should not be watching LSTM/RNN or other such videos 🕺

  • @vivekpujaravp
    @vivekpujaravp 1 year ago

    Fantastic explanation. Please keep making more.

  • @WangY-ip3sb
    @WangY-ip3sb 10 months ago

    Good lecture! Thank you very much for the explanations.

  • @RishabKapadia
    @RishabKapadia 1 year ago

    thank you martin and team. great work.

  • @waleedt_trz
    @waleedt_trz 2 years ago +6

    such an informative lecture, thank you so much

  • @phonethiriyadana
    @phonethiriyadana 2 years ago

    Thanks for the clear explanation.

  • @theneumann7
    @theneumann7 1 year ago

    Great video and nice visual effects!

  • @balenkamal182
    @balenkamal182 1 year ago +1

    Many thanks — your lecture is very helpful. Could you please explain all the LSTM gates (forget gate, learn gate, remember gate, and use (output) gate)?

  • @athmaneghidouche7746
    @athmaneghidouche7746 1 year ago

    Very useful, plain, and concise 😀

  • @mastajigga13
    @mastajigga13 2 years ago

    Thank you for the lecture

  • @tammofrancksen5186
    @tammofrancksen5186 2 years ago

    Really good video. Thank you!

  • @thomasplum9868
    @thomasplum9868 2 years ago +1

    Super good lecture

  • @Gabi_09
    @Gabi_09 1 year ago

    Very good teaching, thank you!

  • @jayasreechaganti9382
    @jayasreechaganti9382 1 year ago +1

    Sir, can you do a video with an RNN example using numerical values?

  • @massimothormann272
    @massimothormann272 1 year ago

    Is there a way to use clustering and similarity measures to load relevant context? Say we have a text corpus: we cluster it, and the LSTM checks which topic the current input probably belongs to and loads information about that topic into the state, or as an extra input?

  • @siddhesh119369
    @siddhesh119369 1 year ago

    really good video, thank you!

  • @akashthoriya
    @akashthoriya 2 years ago +10

    Good lecture. Please make a video on "transformer-based models" too —
    it would be very helpful.

    • @IBMTechnology
      @IBMTechnology  2 years ago +2

      Thanks for the suggestion, Akash! We'll see what we can do! 🙂

  • @kikichung2955
    @kikichung2955 1 year ago

    Useful video. Thanks a lot

  • @user-wq2sn4zb1d
    @user-wq2sn4zb1d 6 months ago

    Hi Martin, regarding your example of Martin and Jennifer: when questions are asked about Jennifer, Martin is no longer relevant, so he'll be forgotten, right? If at some point in the future a question is asked about Martin, is that still relevant to the LSTM? I mean, will it be able to recall Martin even after forgetting him?

  • @kedarnandiwdekar20
    @kedarnandiwdekar20 1 year ago

    Thank you !

  • @shilpadas1311
    @shilpadas1311 2 years ago

    Great lecture, thank you.

    • @IBMTechnology
      @IBMTechnology  2 years ago

      You're welcome and thanks for watching, Shilpa!

  • @mohammadrezarazavian9305
    @mohammadrezarazavian9305 1 year ago

    That was great!

  • @bysedova
    @bysedova 1 year ago

    Please tell me, how is it regulated what the model needs to remember in a sequence? How is that determined? How does the model decide what matters for the inference?

  • @aimatters5600
    @aimatters5600 1 year ago

    thank you

  • @LakshmiDevi_jul31
    @LakshmiDevi_jul31 10 months ago

    Can we use an RNN for CTR prediction?

  • @codeschool3964
    @codeschool3964 5 months ago

    Thanks.

  • @maloukemallouke9735
    @maloukemallouke9735 6 months ago

    Thanks. How can you apply an LSTM to time series?

  • @user-nz5tk9wm7f
    @user-nz5tk9wm7f 2 months ago

    please teach us more content like this..

  • @rollopost
    @rollopost 11 months ago

    Great video - do you write on glass and then transpose/flip the video?

  • @colabwork1910
    @colabwork1910 1 year ago

    Wow, very helpful

  • @gupsekobas2209
    @gupsekobas2209 3 months ago +1

    how can you write backwards?

  • @Grzeroli1
    @Grzeroli1 1 month ago

    How does this trick work — do you record normally and then mirror the video?

  • @fatmamahmoud9148
    @fatmamahmoud9148 1 year ago

    magnificent

  • @matthewg7702
    @matthewg7702 6 months ago

    awesome

  • @trinity98767
    @trinity98767 2 months ago

    The video is a mirrored image of the actual recording, so the writing appears the right way round to the audience.

  • @elevated_existence
    @elevated_existence 1 year ago

    How are you writing?

  • @TaylorDeiaco
    @TaylorDeiaco 6 months ago

    Anyone else distracted by the fact that he's writing backwards? Great vids, keep it up

  • @ashleygillman3104
    @ashleygillman3104 2 years ago +6

    Wait the Homebrew Challenge dude does Deep Learning too?!

  • @BinodLaKshitha
    @BinodLaKshitha 8 months ago

    ❤❤

  • @ivant_true
    @ivant_true 9 months ago

    what if Jennifer is they?

  • @wmxoae1237
    @wmxoae1237 1 year ago

    Can someone please tell me what screen he wrote on? It is so cool! 🤩

  • @willw4096
    @willw4096 8 months ago

    5:00 5:42

  • @anirvinkandarpa5544
    @anirvinkandarpa5544 23 days ago

    why echo though?

  • @willdrunkenstein5367
    @willdrunkenstein5367 1 month ago

    Imagine if the model takes the context "My name " and predicts "J" as the next letter 🤣

  • @abdullahshaikh7409
    @abdullahshaikh7409 1 year ago

    After the first 10 seconds of this video,
    me: Woah, now I know the origins of the Knives Out movie 😂

  • @IconOfSeas
    @IconOfSeas 10 months ago

    saved my exam

  • @amirarshiamirzaei710
    @amirarshiamirzaei710 1 month ago

    who would have guessed that some algorithm could beat the current state-of-the-art memory (e.g. me)

  • @eyupozturk8586
    @eyupozturk8586 1 year ago

    "always butler" is high bias :D

  • @petchpaitoon
    @petchpaitoon 2 years ago +6

    The voice is too low

  • @simplexination9837
    @simplexination9837 1 year ago

    mirror writing🙃🙃🙃🙃🙃

  • @cybrhckr
    @cybrhckr 2 years ago +3

    I see: LSTM = low volume. This is the third guide with almost muted sound.

    • @jeverydk
      @jeverydk 1 year ago +1

      Sound is just fine mate

  • @domnic7431
    @domnic7431 1 year ago

    Are you using a mirror, or can you actually write backwards, sir? 🤐🤐

  • @user-fp8jx7gr7v
    @user-fp8jx7gr7v 5 months ago +1

    anyone here wanting to know the murderer: it's the butler 🤣

  • @manmohanmahapatra6040
    @manmohanmahapatra6040 11 days ago

    VOICE too low!

  • @HITNUT
    @HITNUT 1 year ago

    martin are you married or nah