What is LSTM (Long Short Term Memory)?
- Published: 16 May 2024
- Learn about watsonx → ibm.biz/BdvxRB
Long Short-Term Memory networks, also known as LSTMs, are a special kind of Recurrent Neural Network (RNN) architecture capable of learning long-term dependencies; they also solve the vanishing gradient problem that can occur when training traditional RNNs.
In this lightboard video, Martin Keen with IBM breaks down why we need LSTMs to address the problem of long-term dependencies, how the cell state and its various gates help carry relevant information through a sequence chain, and a few key LSTM use cases.
#LSTM #RNN #AI - Science
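The cell state and gates the description mentions can be sketched as a single LSTM time step in NumPy. This is a minimal illustration, not the video's or any library's implementation; the weight layout, names, and random initialization are assumptions for the example:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.
    x: input vector; h_prev, c_prev: previous hidden and cell state.
    W: weights of shape (4*hidden, input+hidden); b: bias (4*hidden,).
    """
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0*hidden:1*hidden])   # forget gate: what to drop from the cell state
    i = sigmoid(z[1*hidden:2*hidden])   # input gate: what new information to admit
    g = np.tanh(z[2*hidden:3*hidden])   # candidate cell update
    o = sigmoid(z[3*hidden:4*hidden])   # output gate: what to expose as the hidden state
    c = f * c_prev + i * g              # new cell state (the "conveyor belt")
    h = o * np.tanh(c)                  # new hidden state
    return h, c

# Run a short random sequence through the cell (illustrative sizes).
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_in + n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):
    h, c = lstm_step(x, h, c, W, b)
```

Because the forget gate multiplies the old cell state rather than repeatedly squashing it through an activation, gradients can flow across many time steps, which is how the LSTM sidesteps the vanishing gradient problem.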
Martin you are a wonderful teacher! Thank you very much for the explanations.
Great advancement in time. Glad to have a better understanding. Thank you folks
Very helpful lecture. Keep up the good work!
Very useful and helpful. This is a hard topic to grasp quickly, but you did it in just 8 minutes. Thanks for that, Mr. Martin and company. Greetings from Mexico!
I'm in love with his way of teaching!
Clear and concise explanation.
👍
After watching Martin Keen's explanations, you don't need another short explanation
So are we going to ignore the fact he wrote everything backwards on a clear glass wall?
That's likely the #1 comment! The presenter writes on the glass and we flip the image in post-production. Martin is actually right-handed. Some married presenters will switch their wedding ring to the other hand or not wear it so as not to confuse their spouse.
@@IBMTechnology Smart ! I was wondering how you did this.
@@IBMTechnology I don't understand how you made it look like he's writing backwards on glass. Could you explain it to me?
@@abhishekrao6198 The presenter writes with his right hand. The camera is in front of him, so the original footage has all the writing reversed. Flip the video left to right and you get the video you're watching now.
@@IBMTechnology 😂😂…the most obvious observation. People who didn’t even figure that out 🤦♂️should not be watching LSTM/RNN or other such videos 🕺
Fantastic explanation. Please keep making more.
Good lecture ! Thank you very much for the explanations.
thank you martin and team. great work.
such an informative lecture, thank you so much
Thanks for the clear explanation.
Great video and nice visual effects!
Many thanks, your lecture is very helpful. Could you please explain all the gates in an LSTM (Forget Gate, Learn Gate, Remember Gate & Use Gate (Output))?
Very useful, plain, and concise 😀
Thank you for the lecture
Really good video. Thank you!
Super good lecture
Very good teaching, thank you!
Sir, can you do a video with an RNN example using numerical values?
Is there a way to use clustering and similarity measures to load relevant context? Say we have a text corpus and cluster it, and the LSTM checks which topic the current input probably belongs to and loads information about that topic into the state or as an extra input?
really good video, thank you!
Good lecture! Please make a video on transformer-based models too.
It would be very helpful.
Thanks for the suggestion, Akash! We'll see what we can do! 🙂
Useful video. Thanks a lot
Hi Martin, regarding your example of Martin and Jennifer: when questions are asked about Jennifer, Martin is no longer relevant, so he'll be forgotten, right? If a question about Martin comes up later, is that still relevant to the LSTM? I mean, will it be able to recall Martin even after forgetting him?
Thank you !
Great lecture, thank you.
You're welcome and thanks for watching, Shilpa!
That was great!
Tell me please, how is it decided what the model needs to remember in a sequence? How is that determined? How does the model figure out what matters for the prediction?
thank you
Can we use RNNs for CTR prediction?
Thanks.
Thanks! How can you apply LSTMs to time series?
please teach us more content like this..
Great video - do you write on glass and then transpose/flip the video?
See ibm.biz/write-backwards
Wow, very helpful
how can you write backwards?
How does this trick work, you record normally and then mirror the video?
magnificent
awesome
The video is a mirrored image of the actual recording. The writing appears normal to the audience.
How are you writing ?
Anyone else distracted by the fact that he's writing backwards? Great vids, keep it up
See ibm.biz/write-backwards
Wait the Homebrew Challenge dude does Deep Learning too?!
LOL yes, Homebrew Challenge dude has a day job :)
❤❤
what if Jennifer is they?
Who can please tell me what screen he wrote on, it is so cool!🤩
See ibm.biz/write-backwards
5:00 5:42
why echo though?
Imagine if the model takes the context "My name " and predicts "J" as the next letter 🤣
After the first 10 seconds of this video,
me: Woah, now I know the origins of the Knives Out movie 😂
saved my exam
who would have guessed that some algorithm can beat the current state-of-the-art memory (e.g. me)
"always butler" is high bias :D
The voice is too low
Yes needs to work on audio 🔉
Subtitle is a solution
Use earphones. I did and had no problem!
mirror writing🙃🙃🙃🙃🙃
I see: LSTM = low volume. This is the 3rd guide with almost muted sound.
Sound is just fine mate
Are you using a mirror, or can you actually write backwards, sir? 🤐🤐
anyone here to know the murderer, its the butler🤣
VOICE too low!
martin are you married or nah