Martin! I will tell you! Your 5-minute explanation is worth an hour of reading! You are an amazing teacher!
I'm in love with his way of teaching!
Martin you are a wonderful teacher! Thank you very much for the explanations.
this is the greatest rnn and lstm explanation i have found
So are we going to ignore the fact that he wrote everything backwards on a clear glass wall?
That's likely the #1 comment! The presenter writes on the glass and we flip the image in post-production. Martin is actually right-handed. Some married presenters will switch their wedding ring to the other hand or not wear it so as not to confuse their spouse.
@@IBMTechnology Smart ! I was wondering how you did this.
@@IBMTechnology I don't understand how you guys made it look like he's writing backwards on a glass, could you explain it to me.
@@abhishekrao6198 The presenter writes with his right hand. The camera is in front of him, so in the original footage all the writing appears reversed. Flip the video left to right, and you get the video you're watching now.
@@IBMTechnology 😂😂…the most obvious observation. People who didn’t even figure that out 🤦♂️should not be watching LSTM/RNN or other such videos 🕺
Very useful and helpful. This is a hard topic to grasp quickly, but you did it in just 8 minutes. Thanks for that, Mr. Martin and company. Greetings from Mexico!
After watching Martin Keen's explanations, you don't need another short explanation
Very helpful lecture. Keep up the good work!
Great advancement in time. Glad to have a better understanding. Thank you folks
Good lecture! Please make a video on transformer-based models too; it would be very helpful.
Thanks for the suggestion, Akash! We'll see what we can do! 🙂
I'd appreciate a practical video. For instance a medical longitudinal study with N patients and several visits over some time, where they get a certain medication. What would the input need to look like? What is the process?
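On the longitudinal-study question: one common way to frame such data for an LSTM (a sketch under assumptions, not the video's method, and with entirely made-up patients and values) is a tensor of shape (patients, visits, features), padded to the longest visit history, with a mask marking real visits:

```python
import numpy as np

# Hypothetical data: 3 patients, up to 4 visits, 2 features per visit
# (say, medication dose and one lab value). All numbers are invented.
visits = [
    [[10.0, 5.1], [10.0, 4.8], [20.0, 4.2]],                # patient A: 3 visits
    [[10.0, 6.0]],                                          # patient B: 1 visit
    [[20.0, 5.5], [20.0, 5.0], [20.0, 4.9], [10.0, 4.7]],   # patient C: 4 visits
]

max_len = max(len(v) for v in visits)
n_features = 2

# Pad to a dense (patients, visits, features) tensor; mask marks real visits.
X = np.zeros((len(visits), max_len, n_features))
mask = np.zeros((len(visits), max_len), dtype=bool)
for i, v in enumerate(visits):
    X[i, :len(v)] = v
    mask[i, :len(v)] = True

print(X.shape)            # (3, 4, 2)
print(mask.sum(axis=1))   # real visits per patient: [3 1 4]
```

The mask is what lets the network (or the loss function) ignore the zero-padding for patients with fewer visits.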
Anyone else distracted by the fact that he's writing backwards? Great vids, keep it up
See ibm.biz/write-backwards
Please tell me, how is it decided what needs to be remembered in a sequence? How is it determined? Is that how the model figures out what is relevant to the task?
Sir, can you do a video with an RNN example using numerical values?
The video is a mirror image of the actual footage. The writing appears the right way around to the audience.
Fantastic explanation. Please keep making more.
Clear and concise explanation.
👍
mind blowing explanation . thanks
Hi Martin, in your example with Martin and Jennifer: when questions are asked about Jennifer, Martin is no longer relevant, so he'll be forgotten, right? If a question about Martin comes up later, is it still relevant to the LSTM? I mean, will it be able to recall Martin even after forgetting him?
Great video and nice visual effects!
thank you martin and team. great work.
Wait the Homebrew Challenge dude does Deep Learning too?!
LOL yes, Homebrew Challenge dude has a day job :)
Is there a way to use clustering and similarity measures to load relevant context? Say we have a text corpus, we cluster it, and the LSTM checks which topic the current input probably belongs to and loads information about that topic into the state, or as an extra input?
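That idea (retrieving topic context by similarity) can be sketched very simply; this is only an illustration with made-up topic names and vectors, not anything from the video. Each cluster gets a centroid vector, and the nearest centroid to the current input's embedding tells you which topic's summary to feed in as extra input:

```python
import numpy as np

# Hypothetical topic centroids (e.g., averaged word embeddings per cluster).
centroids = {
    "cooking": np.array([0.9, 0.1, 0.0]),
    "sports":  np.array([0.1, 0.9, 0.1]),
}

def nearest_topic(x):
    """Return the topic whose centroid has the highest cosine similarity to x."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(centroids, key=lambda k: cos(x, centroids[k]))

current = np.array([0.8, 0.2, 0.1])   # embedding of the current text (made up)
topic = nearest_topic(current)
print(topic)   # "cooking" for this made-up vector
```

The retrieved topic's centroid (or a richer summary vector) could then be concatenated to the LSTM's input at each step; that's essentially a hand-rolled retrieval step in front of the recurrent model.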
Many thanks, your lecture is very helpful. Could you please explain all the gates of an LSTM (forget gate, learn gate, remember gate, and use/output gate)?
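For anyone curious about those gates, a single LSTM time step in the standard formulation can be written out directly; this is a minimal NumPy sketch with random weights (the "learn/remember/use" names map onto the input, cell-update, and output gates), not IBM's code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step. W: shape (4*hidden, input+hidden), b: (4*hidden,)."""
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[:hidden])              # forget gate: what to discard from memory
    i = sigmoid(z[hidden:2*hidden])      # input ("learn") gate: what to add
    g = np.tanh(z[2*hidden:3*hidden])    # candidate cell values
    o = sigmoid(z[3*hidden:])            # output ("use") gate: what to expose
    c = f * c_prev + i * g               # "remember": updated cell state
    h = o * np.tanh(c)                   # hidden state passed to the next step
    return h, c

# Tiny usage example with random weights.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(size=(4 * n_hid, n_in + n_hid))
b = np.zeros(4 * n_hid)
h, c = lstm_step(rng.normal(size=n_in), np.zeros(n_hid), np.zeros(n_hid), W, b)
print(h.shape, c.shape)   # (4,) (4,)
```

Notice how the forget gate `f` multiplies the old cell state: when `f` is near 0 for some component, that piece of memory (say, "Martin") is effectively erased, which is exactly the behavior the video describes.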
such an informative lecture, thank you so much
Very useful, plain, and concise 😀
Nice Explanation. Thank You.
Good lecture ! Thank you very much for the explanations.
Can we use RNNs for CTR prediction?
please teach us more content like this..
How do we figure out relevance of state?
Are there any new algorithms that are more powerful and efficient than traditional neural networks?
Great video - do you write on glass and then transpose/flip the video?
See ibm.biz/write-backwards
Very good teaching, thank you!
Thanks for the clear explanation.
Is Lstm effective for stock price prediction?
Who can please tell me what screen he wrote on, it is so cool!🤩
See ibm.biz/write-backwards
Thanks! How can you apply an LSTM to time series?
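A common way to frame a univariate time series for an LSTM (a sketch under assumptions, not from the video) is the sliding-window setup: take the last `lookback` values as input and the next value as the target.

```python
import numpy as np

def make_windows(series, lookback):
    """Slide a window of `lookback` past values; predict the next one."""
    X, y = [], []
    for t in range(len(series) - lookback):
        X.append(series[t:t + lookback])
        y.append(series[t + lookback])
    # Shape (samples, timesteps, features=1), as most LSTM layers expect.
    return np.array(X)[..., None], np.array(y)

series = np.arange(10, dtype=float)     # toy series 0..9
X, y = make_windows(series, lookback=3)
print(X.shape, y.shape)                 # (7, 3, 1) (7,)
print(X[0].ravel(), y[0])               # [0. 1. 2.] 3.0
```

The resulting `(samples, timesteps, features)` arrays can then be fed to a recurrent layer in whatever framework you use; in practice you would also normalize the series first.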
How does this trick work, you record normally and then mirror the video?
Useful video. Thanks a lot
Imagine if the model takes the context "My name " and predicts "J" as the next letter 🤣
really good video, thanks you!
Really good video. Thank you!
Super good lecture
how can you write backwards?
Well, he can't; if you look closely, the video is flipped.
what if Jennifer is they?
Thank you for the lecture
Great lecture, thank you.
You're welcome and thanks for watching, Shilpa!
How are you writing ?
Anyone else here to find out who the murderer is? It's the butler 🤣
why echo though?
martin is buying apples. jennifer is making apple pie
That was great!
It's always the gardener ... but very good video anyway :)
Thank you !
saved my exam
After the first 10 seconds of this video, me: Whoa, now I know the origins of the Knives Out movie 😂
thank you
Thanks.
The voice is too low
Yes needs to work on audio 🔉
Subtitle is a solution
Use earphones. I did and had no problem!
Wao, Very helpful
Yeah, it was the butler.
5:00 5:42
Who would have guessed that some algorithm can beat current state-of-the-art memory (e.g., me)?
I see: LSTM = low volume. This is the 3rd guide with almost muted sound.
Sound is just fine mate
It was the butler.
awesome
"always butler" is high bias :D
magnificent
I am cooked
mirror writing🙃🙃🙃🙃🙃
Are you using a mirror, or can you actually write backwards, sir? 🤐🤐
❤❤
Seems like a mixture of RAM and cache/buffer.
By 2050-60, anything ever STORED on hardware will be recoverable.
By 2080, anything PROCESSED by hardware.
martin are you married or nah
Mmmmm, well well, Mr. Butler.
habits
WHODUNNIT
VOICE too low!