Martin! I will tell you: your 5-minute explanation is worth an hour of reading! You are an amazing teacher!
I'm in love with his way of teaching!
Martin you are a wonderful teacher! Thank you very much for the explanations.
This is the greatest RNN and LSTM explanation I have found.
Very helpful lecture. Keep up the good work!
Great advancement in time. Glad to have a better understanding. Thank you folks
Very useful and helpful. This is a hard topic to grasp quickly, but you did it in just 8 minutes. Thanks for that, Mr. Martin and company. Greetings from Mexico!
So are we going to ignore the fact that he wrote everything backwards on a clear glass wall?
That's likely the #1 comment! The presenter writes on the glass and we flip the image in post-production. Martin is actually right-handed. Some married presenters will switch their wedding ring to the other hand or not wear it so as not to confuse their spouse.
@@IBMTechnology Smart! I was wondering how you did this.
@@IBMTechnology I don't understand how you guys made it look like he's writing backwards on glass. Could you explain it to me?
@@abhishekrao6198 The presenter writes with his right hand. The camera is in front of him, so in the original footage all the sentences appear reversed. Flip the video left to right and you get the video you're watching now.
@@IBMTechnology 😂😂…the most obvious observation. People who didn’t even figure that out 🤦♂️should not be watching LSTM/RNN or other such videos 🕺
After watching Martin Keen's explanations, you don't need another short explanation
Clear and concise explanation.
👍
Good lecture! Please make a video on "transformer-based models" too; it will be very helpful.
Thanks for the suggestion, Akash! We'll see what we can do! 🙂
Fantastic explanation. Please keep making more.
Thank you, Martin and team. Great work.
Great video and nice visual effects!
Mind-blowing explanation. Thanks!
Such an informative lecture, thank you so much!
Awesome video! Wondering how you can write mirrored :)
Lol, I have the same question! And I just can't get past it while watching this video.
Good lecture! Thank you very much for the explanations.
Useful video. Thanks a lot
Thanks for the clear explanation.
I'd appreciate a practical video. For instance, a medical longitudinal study with N patients and several visits over some period, where they receive a certain medication. What would the input need to look like? What is the process?
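Not from the video, but a rough sketch of how that kind of input is often shaped for an LSTM: one row per patient, one timestep per visit, one feature column per measurement. All names and numbers below are made up for illustration; something like this in Keras:

import numpy as np
from tensorflow import keras

# Hypothetical longitudinal data: N patients, up to T visits, F features per
# visit (e.g. dose, blood pressure, a lab value). Patients with fewer visits
# are zero-padded so every sequence has length T.
N, T, F = 100, 8, 3
X = np.random.rand(N, T, F).astype("float32")  # stand-in for real visit records
y = np.random.rand(N, 1).astype("float32")     # e.g. an outcome after the last visit

model = keras.Sequential([
    keras.layers.Masking(mask_value=0.0, input_shape=(T, F)),  # skip padded visits
    keras.layers.LSTM(32),   # one cell state carried across a patient's visits
    keras.layers.Dense(1),   # predict the outcome
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=16)

So the process is roughly: arrange the records as (patients, visits, features), pad and mask variable-length visit histories, then let the LSTM carry its state from visit to visit.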
Very good teaching, thank you!
Very useful, plain, and concise 😀
Really good video, thank you!
Really good video. Thank you!
Hey! Well explained 🎉❤
Anyone else distracted by the fact that he's writing backwards? Great vids, keep it up
See ibm.biz/write-backwards
Thank you for the lecture
Super good lecture
Nice explanation. Thank you.
Wait the Homebrew Challenge dude does Deep Learning too?!
LOL yes, Homebrew Challenge dude has a day job :)
Great lecture, thank you.
You're welcome and thanks for watching, Shilpa!
Please tell me, how is it regulated what the model needs to remember in a sequence? How is that determined? Is that how the model decides what's relevant to the investigation?
Sir, can you do a video with an RNN example using numerical values?
Are there any new algorithms that are more powerful and more efficient than traditional neural networks?
Many thanks, your lecture is very helpful. Could you please explain all the LSTM gates (Forget Gate, Learn Gate, Remember Gate & Use Gate (Output))?
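Not covered in the video, but for anyone curious in the meantime, these are the standard LSTM gate equations; the "Learn", "Remember", and "Use" gates are usually called the input gate, cell-state update, and output gate (\sigma is the sigmoid, \odot is element-wise multiplication):

f_t = \sigma(W_f [h_{t-1}, x_t] + b_f)            % forget gate: what to drop from the cell state
i_t = \sigma(W_i [h_{t-1}, x_t] + b_i)            % learn/input gate: what new information to admit
\tilde{c}_t = \tanh(W_c [h_{t-1}, x_t] + b_c)     % candidate cell update
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t   % remember: the updated cell state
o_t = \sigma(W_o [h_{t-1}, x_t] + b_o)            % use/output gate
h_t = o_t \odot \tanh(c_t)                        % the new hidden state / output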
I'm wondering what kind of listeners this explanation of LSTMs is addressed to. I was expecting some technical details of LSTMs to gain a deeper understanding of why LSTMs work. Because this video is provided by IBM, I was expecting a high-quality presentation. Maybe it would be better to tell the audience what they can and cannot expect from it, to manage their expectations. I hope this comment helps.
Thank you!
How can you write backwards?
Well, he can't. If you look closely, the video is flipped.
Please teach us more content like this.
Is LSTM effective for stock price prediction?
How does this trick work? Do you record normally and then mirror the video?
Hi Martin, in your example of Martin and Jennifer: when the questions are about Jennifer, Martin is no longer relevant, so he'll be forgotten, right? If at some point in the future a question is asked about Martin, is that still accessible to the LSTM? I mean, will it be able to recall Martin even after forgetting him?
Great video - do you write on glass and then transpose/flip the video?
See ibm.biz/write-backwards
Imagine if the model takes the context "My name " and predicts "J" as the next letter 🤣
How do we figure out the relevance of the state?
The video is a mirrored image of the actual footage, so the writing appears the right way around to the audience.
thank you
Is there a way to use clustering and similarity measures to load relevant context? Say we have a text corpus and cluster it; the LSTM checks which cluster the current topic probably belongs to and loads information about that topic into the state, or as an extra input.
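That's not a standard LSTM component as far as I know, but the idea is easy to prototype: cluster the corpus embeddings, and at each step look up the nearest cluster and append its centroid to the input as extra context. A minimal sketch with scikit-learn (all sizes and names below are made up):

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
corpus_embeddings = rng.normal(size=(1000, 64))      # placeholder document embeddings
kmeans = KMeans(n_clusters=10, n_init=10).fit(corpus_embeddings)

def with_topic_context(step_embedding):
    # Find the cluster the current input most likely belongs to...
    topic = kmeans.predict(step_embedding[None, :])[0]
    # ...and append that cluster's centroid as extra features for the LSTM step.
    return np.concatenate([step_embedding, kmeans.cluster_centers_[topic]])

x_t = rng.normal(size=64)              # embedding of the current input
lstm_input = with_topic_context(x_t)   # shape (128,): original + topic centroid

Feeding it in as an extra input (rather than writing into the cell state directly) is the simpler of the two options mentioned, since it doesn't require touching the LSTM internals.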
Can someone please tell me what screen he wrote on? It is so cool! 🤩
See ibm.biz/write-backwards
That was great!
Can we use RNNs for CTR prediction?
Thanks! How can you apply LSTM to time series?
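One common recipe (an illustration, not something from the video): slide a fixed window over the series and train the LSTM to predict the next value. A toy sketch in Keras:

import numpy as np
from tensorflow import keras

series = np.sin(np.linspace(0, 20, 500)).astype("float32")  # toy time series
window = 20

# Each sample is the last `window` values; the target is the value that follows.
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., None]  # LSTM expects (samples, timesteps, features)

model = keras.Sequential([
    keras.layers.LSTM(16, input_shape=(window, 1)),
    keras.layers.Dense(1),  # next-value regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=32, verbose=0)

next_value = model.predict(X[-1:])  # one-step-ahead forecast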
Thanks.
why echo though?
How are you writing?
Wow, very helpful!
awesome
After the first 10 seconds of this video, me: Woah, now I know the origins of the Knives Out movie 😂
Martin is buying apples. Jennifer is making apple pie.
What if Jennifer is "they"?
The voice is too low
Yes, they need to work on the audio 🔉
Subtitles are a solution.
Use earphones. I did and had no problem!
It's always the gardener... but very good video anyway :)
Anyone here just to find out who the murderer is? It's the butler 🤣
magnificent
5:00 5:42
Who would have guessed that some algorithm can beat the current state-of-the-art memory (e.g. me)?
saved my exam
I see: LSTM = low volume. This is the 3rd guide with almost muted sound.
Sound is just fine mate
❤❤
Yeah, it was the butler.
mirror writing🙃🙃🙃🙃🙃
"always butler" is high bias :D
It was the butler.
I am cooked
Are you using a mirror, or can you actually write backwards, sir? 🤐🤐
Hmmm. Well, well, Butler.
Seems like a mixture of RAM and cache/buffer.
By 2050-60, anything ever STORED on hardware will be recoverable.
By 2080, anything ever PROCESSED by hardware.
WHODUNNIT
Martin, are you married or nah?
habits
VOICE too low!