Have been learning from RUclips for quite some years, have never seen a teacher like you.. Hats off.
First like then watch, because quality content is guaranteed
Thanks!
I am writing to kindly request that you consider creating videos on topics such as BERT, DistilBERT, and Transformers as soon as possible, because of the ongoing placement activities. Also, please guide us on how to use the Hugging Face interface in the context of different NLP use cases. I understand that creating content requires time and effort, but I believe your expertise would greatly enhance our understanding of these crucial topics. Thank you in advance; eagerly waiting for your future content.
Hi ma'am, how and where are you giving placement interviews? I am a junior and I am getting confused about how to apply for internships and all.
I didn't find a better explanation even in English.
Million-dollar lecture, bro, grab it... clear concepts in Hindi... better lectures than the IITs' and NITs' teachers.
also IIITs
Doubt at 31:39: instead of an ANN, why are we not using a cosine similarity score, since you said alpha is a score value?
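On this doubt: a cosine score would work mechanically, but it has no trainable parameters, so the model could not learn which encoder states matter for the task; the small ANN's weights are trained jointly with the rest of the network. For illustration only, a cosine-based score over made-up vectors would look like this (all numbers hypothetical):

```python
import math

def cosine(a, b):
    # cosine similarity between two vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical previous decoder state and encoder hidden states
s_prev = [1.0, 2.0, 3.0, 4.0]
encoder_states = [
    [5.0, 6.0, 7.0, 8.0],
    [1.0, 0.0, 1.0, 0.0],
    [2.0, 2.0, 2.0, 2.0],
]

# Fixed (untrainable) alignment scores -- nothing here can adapt during training
scores = [cosine(s_prev, h) for h in encoder_states]
```

Because every value here is fully determined by the vectors themselves, gradient descent has nothing to adjust, which is the usual argument for a small learned scoring network instead.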
I have tried multiple resources to learn this difficult concept... but the way you have explained it is God level!!! Thanks for your efforts, Nitish Sir ❤❤
Super excited for the upcoming videos.
He explains it so precisely! The good thing about his teaching is that he does not make the video short just to finish the topic, instead, he explains each thing with patience! Hats off!
What more do you need in life when you get education like this for free 🥳
I can't understand how this video has 23k views but only 926 likes!
Like, wtf.
You should just do it, then!
This was just amazing, sir. I have watched so many videos to understand the "C" value; no one gives a clear explanation except you. Looking forward to the next video from you.
Sir, when will the next video in this series come?
Please upload the next video soon; we are waiting for Transformers and BERT.
Hello, sir! Pls keep uploading this playlist. Eagerly waiting for the next video!!!
Best explanation I have seen so far for the attention mechanism.
Simple and easy to understand 👌👌👌👌👌👌👌👌👌👌
Sir, you are making awesome videos with an excellent way of teaching.
Sir, you've done a great job, but after this kind of explanation, please make a video in which the code is implemented, and make regular projects for each kind of mechanism for better understanding.
Hi @campusX, this is the cleanest explanation available on YouTube so far. One confusion here: don't you think that to create Ci, f(s_{t-1}, h_j), where f is a neural network, will produce e_ij, and all the e_ij (for i=1, j=1..4) will pass through a softmax to generate the weights alpha_ij?
I am planning to start your course from 2 Jan 2023. I just thought to check when you last uploaded; this video is just 10 days old, nice. I hope to complete this course in 3-4 months; it's a gem, honestly.
Hello Nitish, your explanations are really, really awesome. I need to understand Transformers, so I am waiting for that.
Thank You Sir.
Well sir, as always the video is so perfect, but I still think I need more practice on it. Thanks for giving the best version of each topic. Love from Pakistan, Sir Nitish. 😍
For the first time in 2 years of trying, the attention mechanism is clear to me now. Thanks!
Hi sir, I have a doubt: we are feeding the input to our decoder as y1 and c1, but inside the decoder there is an RNN cell, so how can we send two inputs? Are we doing some operation before feeding them in, like a dot product of y1 and c1, so that we get the dimension back?
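On this doubt: in the usual attention decoder the two inputs are not dot-multiplied; y1 and c1 are typically concatenated into one longer vector, and the RNN cell's input weight matrix is simply sized for that combined length. A tiny pure-Python sketch with made-up dimensions and random (untrained) weights:

```python
import math
import random

random.seed(1)

y_prev = [0.2, 0.4, 0.6]        # made-up embedding of the previous output token
c_t = [1.0, 0.5, 0.25, 0.125]   # made-up context vector from attention

x = y_prev + c_t                # concatenation: length 3 + 4 = 7

# A simple tanh RNN cell whose input weights are sized for the combined vector
hidden_size = 5
s_prev = [0.0] * hidden_size
W = [[random.gauss(0, 0.1) for _ in range(len(x) + hidden_size)]
     for _ in range(hidden_size)]

z = x + s_prev                  # cell input: [y_prev; c_t; s_prev]
s_t = [math.tanh(sum(w * v for w, v in zip(row, z))) for row in W]
```

So no dimension trick is needed: the cell just accepts a wider input vector.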
Sir MLOPs ka next videos pls upload karo, eagerly waiting Sir!!
Where are you passing only h? Your c is the sum of all the h's multiplied by their respective weights. For example, if h4 is a vector, say [1,2,3,4], then your c is not h4; instead, c1 = alpha1*h1 + alpha2*h2 + alpha3*h3 + alpha4*h4, which will be a completely different vector, not [1,2,3,4].
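The weighted sum this comment describes, written out with made-up hidden states and attention weights:

```python
# Made-up encoder hidden states h1..h4 and attention weights alpha1..alpha4
h = [
    [1.0, 2.0, 3.0, 4.0],   # h1
    [4.0, 3.0, 2.0, 1.0],   # h2
    [0.0, 1.0, 0.0, 1.0],   # h3
    [2.0, 2.0, 2.0, 2.0],   # h4
]
alphas = [0.1, 0.2, 0.3, 0.4]   # softmax output, sums to 1

# c1 = alpha1*h1 + alpha2*h2 + alpha3*h3 + alpha4*h4, dimension by dimension
c1 = [sum(a * h_j[d] for a, h_j in zip(alphas, h)) for d in range(4)]
# c1 == [1.7, 1.9, 1.5, 1.7] (up to float rounding) -- a new vector, not any single h_j
```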
TYSM Sir, Please also post Transformer lecture
Best teacher I have ever seen in my life. This legend made a guy who was scared of Maths during school fall in love with it. A BIG BIG BIG thank you, Nitish sir. I have a dream to meet you once in my life before I die ♥
Sir, great teaching; it would be even better if you could show some practicals.
That was great fun ❤
Thanks
Sir, please make a playlist on LLMs.
Thanks sir, all clear, but one doubt only; anyone please answer. For i=2, is there one ANN or four ANNs? Let's say s1 = [1,2,3,4] and h1 = [5,6,7,8]; then to get alpha21, will the input to the ANN be [1,2,3,4,5,6,7,8], or a combination of all the h's?
I think there will be a single ANN for a particular Ci, and in total there will be 4 ANNs (for C1, C2, C3 and C4).
Regarding the calculation of alpha21: you provide s1 and h1, and alpha21 is calculated. Similarly, for calculating alpha22 you provide s1 and h2, and so on...
(I can't guarantee that this explanation is correct.)
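For what it's worth, in the standard Bahdanau-style formulation there is a single alignment network shared across all (i, j) pairs; it is applied once per encoder state, taking s_{i-1} and h_j together (e.g. concatenated). A hypothetical pure-Python sketch (weights are random placeholders; in training they would be learned):

```python
import math
import random

random.seed(0)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    total = sum(es)
    return [e / total for e in es]

# One shared scoring network: e_ij = v . tanh(W [s_{i-1}; h_j])
W = [[random.gauss(0, 0.1) for _ in range(8)] for _ in range(6)]
v = [random.gauss(0, 0.1) for _ in range(6)]

def score(s_prev, h_j):
    x = s_prev + h_j                # concatenation, e.g. [1,2,3,4,5,6,7,8]
    return dot(v, [math.tanh(dot(row, x)) for row in W])

s1 = [1.0, 2.0, 3.0, 4.0]
encoder_states = [
    [5.0, 6.0, 7.0, 8.0],
    [1.0, 1.0, 0.0, 0.0],
    [2.0, 0.0, 2.0, 0.0],
    [3.0, 3.0, 3.0, 3.0],
]

e2 = [score(s1, h_j) for h_j in encoder_states]   # e21..e24, same network each time
alpha2 = softmax(e2)                              # alpha21..alpha24, sum to 1
```

So for alpha21 the input is indeed [1,2,3,4,5,6,7,8] (s1 followed by h1), and the same `score` function is reused for alpha22..alpha24, just with a different h_j.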
Thank you for such precise and clear explanation videos. 🙏🙏🙏🙏🙏🙏
When can we expect the next video in the series... Transformers?
Hats off... such a clear, step-by-step approach. You connect concepts amazingly well. Great teaching.
Sir, please make a video about your achievements and your job:
--- your job
--- your monthly income (earned online by selling your skills, not from social media like YouTube, etc.)
--- and many more things
Great one. Waiting for Transformers. BTW, can you also talk about tools like LLaMA, LangChain, RAG, LLM tuning, DPO, etc. as well? Coming from the guru, it will help the community.
Make some videos for those who want to start their AI/ML startups.
Hello sir, please cover topics from GANs and diffusion models as well.
He has a video on GANs.
Sir, I am waiting for the second Attention Mechanism video.
Sir, Kindly complete playlist please.
God bless you... man, what a session... thanks a lot!!
Amazing explanation of the topic
Sir waiting for Transformers lecture
When will the video on Transformers come?
Sir, please make videos on generative AI and on new AI tools based on generative AI.
NER and topic modelling from the NLP playlist are still pending.
super excited to learn transformers😍😍
Make a video on the PointNet architecture.
Thank you so much for this video. Please provide the Transformer videos at least by Christmas; I have an interview lined up. It's a humble request.
You mentioned in the NLP playlist that once deep learning is covered, you will cover Topic Modelling and NER as well. Please cover both these topics to complete your NLP playlist.
Can you make a separate video on LLMs?
Much awaited
But where will Ci be applied?
Thank you sir ❤
Sir Please add next videos.
What a fabulous explanation it was. Mind-blowing. Thanks a ton for explaining it so clearly.
can't wait for the Transformers video
Eagerly waiting for the self-attention and multi-head attention videos now.
I have no words to express how grateful I am. You really inspire me. I would recommend you sell some goodies (t-shirts) so that we can purchase them and show our love for you.
You are a great teacher... more videos like this, please.
What a video! 😀 That was great fun.
Thanks a lot for this amazing video! I always wait for the next one in this series ❤. Can you tell us when the Transformers videos will start?
All your videos are exceptionally excellent and knowledgeable... Please, sir, also make videos on GNNs in full detail, covering their architecture and types... It's a humble request to you 🙏
Best Teacher Ever
this is helpful 🖤🤗
Next video please, sir.
Looking forward to finishing this; nice work, bro.
Thank you SIr
Love you Sir.
Sir, next video.
tytytyty
One suggestion: please do not say in your videos that a topic is tough to understand. The point is that when you are teaching, it doesn't seem tough; and secondly, saying "the topic is tough" psychologically impacts the learners...
I have not found any tough video so far. Your explanation is super!
❤❤❤❤❤❤❤❤❤❤
Understood++
❤
I first like the video, then watch it because I know the quality of the lecture.
Amazing explanation. Thank you very much sir.
You are great, sir.
Thank you!
Worth it
Very good explanation. Please make the self-attention video also.
@Shuraim843 Really, that's news. Do you mind explaining in detail your understanding of self-attention and how LLMs use it to produce SOTA results?
Sir, please also bring the implementation video for the encoder-decoder.