Attention Mechanism in 1 video | Seq2Seq Networks | Encoder Decoder Architecture

  • Published: Sep 29, 2024

Comments • 111

  • @cool12345687
    @cool12345687 9 months ago +32

    I have been learning from RUclips for quite a few years and have never seen a teacher like you. Hats off.

  • @vijaypalmanit
    @vijaypalmanit 9 months ago

    First like, then watch, because quality content is guaranteed.

  • @somdubey5436
    @somdubey5436 8 months ago +12

    Thanks!

  • @kalyanikadu8996
    @kalyanikadu8996 8 months ago +9

    I am writing to kindly request that you consider creating videos on topics such as BERT, DistilBERT, and Transformers as soon as possible, because placement activities are actively ongoing. Also, please guide us on how to use the Hugging Face interface for different NLP use cases. I understand that creating content requires time and effort, but I believe your expertise would greatly enhance our understanding of these highly important topics. Thank you in advance; eagerly waiting for your future content.

    • @WIN_1306
      @WIN_1306 3 months ago

      Hi ma'am, how and where are you giving placement interviews? I am a junior and I am getting confused about how to apply for internships.

  • @not_amanullah
    @not_amanullah 9 months ago +1

    I didn't find a better explanation, even in English.

  • @Ocean_77848
    @Ocean_77848 9 months ago +12

    A million-dollar lecture, bro, grab it... clear concepts in Hindi... better lectures than IIT and NIT teachers.

    • @WIN_1306
      @WIN_1306 3 months ago +1

      also IIITs

  • @Watchtower-h4u
    @Watchtower-h4u 7 months ago +1

    Doubt at 31:39: instead of an ANN, why are we not using a cosine similarity score, since you said alpha is a score value?
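Both options exist in the literature: a parameter-free similarity such as cosine (close in spirit to Luong-style dot-product attention) is a valid score, while the ANN in the video is the Bahdanau-style *learned* score, which can capture task-specific relevance that raw geometric similarity cannot. A minimal NumPy sketch of the two, where the shapes and the weights `W`, `v` are purely illustrative (not from the video):

```python
import numpy as np

def cosine_score(s_prev, h_j):
    # Parameter-free similarity; only defined when s_prev and h_j share a dimension.
    return float(s_prev @ h_j / (np.linalg.norm(s_prev) * np.linalg.norm(h_j)))

def additive_score(s_prev, h_j, W, v):
    # Bahdanau-style learned score e_ij = v^T tanh(W [s_prev; h_j]).
    # W and v are trained jointly with the rest of the model.
    return float(v @ np.tanh(W @ np.concatenate([s_prev, h_j])))

rng = np.random.default_rng(0)
s_prev = rng.normal(size=4)               # previous decoder state s_{i-1}
encoder_states = rng.normal(size=(3, 4))  # encoder hidden states h_1..h_3

W, v = rng.normal(size=(8, 8)), rng.normal(size=8)
cos = [cosine_score(s_prev, h_j) for h_j in encoder_states]
add = [additive_score(s_prev, h_j, W, v) for h_j in encoder_states]
print(cos)   # each value lies in [-1, 1]
print(add)   # unbounded learned scores
```

Either list of scores can then be fed to a softmax to get the alphas; the learned version is what the lecture describes.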

  • @Sandesh.Deshmukh
    @Sandesh.Deshmukh 9 months ago +4

    I have tried multiple resources to learn this difficult concept, but the way you have explained it is God level! Thanks for your efforts, Nitish Sir ❤❤
    Super excited for the upcoming videos.

  • @NabidAlam360
    @NabidAlam360 8 months ago +5

    He explains it so precisely! The good thing about his teaching is that he does not make the video short just to finish the topic; instead, he explains everything with patience! Hats off!

  • @Shisuiii69
    @Shisuiii69 1 month ago +1

    What more do you need in life when you get education like this for free 🥳

  • @WIN_1306
    @WIN_1306 3 months ago +1

    I can't understand how this video has 23k views but only 926 likes!
    Like, wtf.

  • @haseebmohammed3728
    @haseebmohammed3728 9 months ago +4

    This was just amazing, sir. I have watched so many videos to understand the "C" value; no one gave a clear explanation except you. Looking forward to the next video from you.

  • @GauravKumar-dx5fp
    @GauravKumar-dx5fp 9 months ago +1

    Sir, when will the next video in this series come?

  • @akshayshelke5779
    @akshayshelke5779 9 months ago +1

    Please upload the next video soon; we are waiting for Transformers and BERT.

  • @sakshipote20
    @sakshipote20 8 months ago +3

    Hello, sir! Please keep uploading this playlist. Eagerly waiting for the next video!!!

  • @Sam-gl1md
    @Sam-gl1md 9 months ago +4

    The best explanation I have seen so far for the attention mechanism.
    Simple and easy to understand 👌👌👌

  • @utkarshtripathi9118
    @utkarshtripathi9118 9 months ago +2

    Sir, you are making awesome videos, with an excellent way of teaching.

  • @prathamagarwal8582
    @prathamagarwal8582 1 month ago

    Sir, you have done a great job, but after this type of explanation, please make a video in which the code is written, and make regular projects for each kind of mechanism for better understanding.

  • @mrityunjaykumar2893
    @mrityunjaykumar2893 4 months ago

    Hi @campusX, this is the cleanest and best explanation available on YouTube so far. One confusion here: don't you think that to create Ci, f(St-1, Hj), with f a neural network, will produce Eij, and all the Eij (for i = 1, j = 1-4) will pass through a softmax to generate the weights alpha ij?

  • @vishutanwar
    @vishutanwar 9 months ago +2

    I am planning to start your course from 2 Jan 2023.
    I just checked when you uploaded the last video; it is just 10 days old. Nice. I hope to complete this course in 3-4 months.

  • @koushik7604
    @koushik7604 9 months ago +3

    It's a gem, honestly.

  • @pyclassy
    @pyclassy 8 months ago +1

    Hello Nitesh, your explanations are really, really awesome. I need to understand Transformers, so I am waiting for that.

  • @ParthivShah
    @ParthivShah 4 months ago +1

    Thank You Sir.

  • @mentalgaming2739
    @mentalgaming2739 7 months ago +1

    Well sir, as always the video is so perfect, but I still think I need more practice on it. Thanks for giving the best version of each topic. Love from Pakistan, Sir Nitish. 😍

  • @shivampradhan6101
    @shivampradhan6101 8 months ago +3

    For the first time in two years of trying, the attention mechanism is clear to me now. Thanks.

  • @anupamgevariya341
    @anupamgevariya341 7 months ago

    Hi sir, I have a doubt: we are feeding input to our decoder as y1 and c1, but inside the decoder it is an RNN cell, so how can we send two inputs? Are we doing some operation before feeding them in, like a dot product of y1 and c1, so we get the dimension back?

  • @gourabguha3167
    @gourabguha3167 7 months ago

    Sir, please upload the next MLOps videos; eagerly waiting, sir!!

  • @debojitmandal8670
    @debojitmandal8670 9 months ago

    Where you are passing only h: your c is nothing but the sum of all h multiplied by their respective weights. For example, if h4 is a vector, say [1,2,3,4], then your c is not h4; instead it's a different vector, c1 = weight1*h1 + weight2*h2 + weight3*h3 + weight4*h4, which will be a completely different vector, not [1,2,3,4].
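The weighted sum the comment describes is easy to check numerically. A toy NumPy sketch with made-up encoder states and attention weights (the values are illustrative only):

```python
import numpy as np

# Toy encoder states h_1..h_4, one per input word; h_4 = [1, 2, 3, 4] as in the comment.
h = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.],
              [0., 0., 1., 0.],
              [1., 2., 3., 4.]])
alpha = np.array([0.1, 0.2, 0.3, 0.4])   # attention weights from the softmax; they sum to 1

# Context vector: weighted sum over ALL encoder states, not a copy of any single h_j.
c1 = (alpha[:, None] * h).sum(axis=0)
print(c1)   # [0.5, 1.0, 1.5, 1.6], different from h_4
```

So even when h4 gets the largest weight, c1 blends information from every encoder step.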

  • @itsamankumar403
    @itsamankumar403 9 months ago +1

    TYSM sir. Please also post the Transformer lecture.

  • @PratyakshGautam-nc4mi
    @PratyakshGautam-nc4mi 1 month ago +1

    The best teacher I have ever seen in my life. This legend made a guy who was scared of maths during school fall in love with it. A BIG BIG BIG thank you, Nitish sir. I have a dream to meet you once in my life ♥

  • @ayushmantomar2168
    @ayushmantomar2168 2 months ago

    Sir, great teaching; it would be even better if you showed some practicals.

  • @bibhutibaibhavbora8770
    @bibhutibaibhavbora8770 9 months ago +1

    I really enjoyed it ❤

  • @not_amanullah
    @not_amanullah 8 months ago +1

    Thanks

  • @data_scientist_harish
    @data_scientist_harish 9 months ago +1

    Sir, please make a playlist on LLMs.

  • @subhadwipmanna4130
    @subhadwipmanna4130 8 months ago

    Thanks sir, all clear, but one doubt; can anyone please answer? For i=2, is there one ANN or four ANNs? Let's say s1 = [1,2,3,4] and h1 = [5,6,7,8]; so to get alpha21, will the input to the ANN be [1,2,3,4,5,6,7,8], or a combination of all h?

    • @neilansh
      @neilansh 8 months ago

      I think there will be a single ANN for a particular Ci, and in total there will be 4 ANNs (for C1, C2, C3 and C4).
      Regarding the calculation of alpha21: you will provide s1 and h1, and alpha21 will be calculated. Similarly, for calculating alpha22 you will provide s1 and h2, and so on...
      (I can't guarantee that this explanation is correct.)
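For what it's worth, in the standard Bahdanau formulation there is a single alignment network shared across all decoder steps (not one per Ci); it is simply evaluated once per (s1, h_j) pair, with the concatenated vector as input, exactly as the reply describes for alpha21. With 4-dim states, that input is indeed 8-dim. A toy NumPy sketch with made-up weights (shapes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
s1 = rng.normal(size=4)              # decoder state s_1
H = rng.normal(size=(4, 4))          # encoder states h_1..h_4

# One shared alignment network; its input is the concatenation [s1; h_j], so 8 dims here.
W = rng.normal(size=(8, 8))
v = rng.normal(size=8)

# Score each (s1, h_j) pair with the SAME network, then softmax over the four scores.
e = np.array([v @ np.tanh(W @ np.concatenate([s1, h_j])) for h_j in H])
alpha = np.exp(e) / np.exp(e).sum()  # alpha_21 .. alpha_24, coupled by the softmax
print(alpha, alpha.sum())
```

Note that alpha21 alone cannot be computed in isolation: the softmax normalizes across all four scores, so every h_j participates in each weight.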

  • @AbhishekM-h5e
    @AbhishekM-h5e 8 months ago

    Hello, sir! Please keep uploading this playlist. Eagerly waiting for the next video!!!

  • @SambitSatapathy-e4b
    @SambitSatapathy-e4b 7 months ago +1

    Thank you for such precise and clear explanation videos. 🙏🙏🙏

  • @saikatdaw6909
    @saikatdaw6909 8 months ago

    When can we expect the next video in the series... Transformers?

  • @ranaasad6887
    @ranaasad6887 8 months ago +1

    Hats off... such a clear and step-by-step approach. You connect concepts amazingly well. Great teaching.

  • @muhammadikram375
    @muhammadikram375 9 months ago

    Sir, please make a video about your achievements and your job:
    --- your job
    --- your monthly income (earned online by selling your skill, not from social media, i.e. RUclips etc.)
    --- and more

  • @KumR
    @KumR 8 months ago

    Great one. Waiting for Transformers. BTW, can you also speak about tools like Llama, LangChain, RAG, LLM tuning, DPO, etc.? Coming from the guru, it will help the community.

  • @HimanshuSharma-we5li
    @HimanshuSharma-we5li 7 months ago

    Make some videos for those who want to start their own AI/ML startups.

  • @NarendraBME
    @NarendraBME 8 months ago

    Hello sir, please cover topics from GANs and diffusion models as well.

  • @majidhabibkhan625
    @majidhabibkhan625 6 months ago

    Sir, I am waiting for the Attention Mechanism video 2.

  • @eyeonmystery
    @eyeonmystery 8 months ago

    Sir, kindly complete the playlist please.

  • @mohammadarif8057
    @mohammadarif8057 8 months ago +1

    God bless you, man, what a session... thanks a lot!!

  • @hetparekh1556
    @hetparekh1556 7 months ago +1

    Amazing explanation of the topic

  • @naveenpoliasetty954
    @naveenpoliasetty954 9 months ago

    Sir, waiting for the Transformers lecture.

  • @teksinghayer5469
    @teksinghayer5469 8 months ago

    When will the video on Transformers come?

  • @utkarshtripathi9118
    @utkarshtripathi9118 9 months ago

    Sir, please make videos on generative AI and new AI tools based on generative AI.

  • @himanshurathod4086
    @himanshurathod4086 9 months ago

    NER and topic modeling from NLP are still remaining.

  • @ranaasad6887
    @ranaasad6887 8 months ago

    Super excited to learn Transformers 😍😍

  • @dattatreyanh6121
    @dattatreyanh6121 6 months ago

    Make a video on the PointNet architecture.

  • @riyatiwari4767
    @riyatiwari4767 9 months ago

    Thank you so much for this video. Please provide the Transformer videos at least by Christmas; I have an interview lined up. It's a humble request.

  • @riyatiwari4767
    @riyatiwari4767 9 months ago

    You mentioned in the NLP playlist that once deep learning is covered you will cover topic modelling & NER as well. Please cover both these topics to complete your NLP playlist.

  • @lakshmims7590
    @lakshmims7590 6 months ago

    Can you make a separate video on LLMs?

  • @atrijpaul4009
    @atrijpaul4009 9 months ago +1

    Much awaited

  • @sharangkulkarni1759
    @sharangkulkarni1759 3 months ago

    But where will Ci be used?

  • @RashidAli-jh8zu
    @RashidAli-jh8zu 8 months ago

    Thank you sir ❤

  • @kalyanikadu8996
    @kalyanikadu8996 8 months ago

    Sir, please add the next videos.

  • @aritradutta9538
    @aritradutta9538 6 months ago

    What a fabulous explanation it was. Mind-blowing. Thanks a ton for explaining this so clearly.

  • @keshavpoddar9288
    @keshavpoddar9288 9 months ago

    can't wait for the Transformers video

  • @guptariya43
    @guptariya43 9 months ago

    Eagerly waiting for the self-attention and multi-head attention videos now...

  • @sowmyak3326
    @sowmyak3326 5 months ago

    I have no words to express how grateful I am. You really inspire me. I would recommend you sell some goodies (t-shirts) so that we can purchase them and show our love towards you.

  • @gulamkibria3769
    @gulamkibria3769 9 months ago +1

    You are a great teacher... more videos like this, please.

  • @financegyaninstockmarket2960
    @financegyaninstockmarket2960 5 months ago

    What a video! 😀 I really enjoyed it.

  • @ariousvinx
    @ariousvinx 9 months ago

    Thanks a lot for this amazing video! I always wait for the next video of this series ❤. Can you tell us when Transformers will start?

  • @shalinigarg8522
    @shalinigarg8522 9 months ago

    All your videos are exceptionally excellent and knowledgeable... Please, sir, make videos on GNNs also, in full detail, with their architecture and types... It's a humble request to you 🙏

  • @technicalhouse9820
    @technicalhouse9820 7 months ago +1

    Best Teacher Ever

  • @Amanullah-wy3ur
    @Amanullah-wy3ur 3 months ago

    this is helpful 🖤🤗

  • @eyeonmystery
    @eyeonmystery 8 months ago

    Next video please, sir.

  • @shyamjain8379
    @shyamjain8379 9 months ago +1

    Looking forward to finishing this; nice work, bro.

  • @paritoshkumar7422
    @paritoshkumar7422 3 months ago

    Thank you, Sir.

  • @ALLABOUTUFC-ms8nt
    @ALLABOUTUFC-ms8nt 8 months ago

    Love you Sir.

  • @eyeonmystery
    @eyeonmystery 8 months ago

    Sir, next video.

  • @toxoreed4313
    @toxoreed4313 9 months ago

    tytytyty

  • @aadityaadyotshrivastava2030
    @aadityaadyotshrivastava2030 7 months ago

    One suggestion: please do not say in your video that a topic is tough to understand. The point is that when you are teaching, it doesn't seem tough, and secondly, saying "the topic is tough" psychologically impacts the learners.
    I have not found any tough video so far. Your explanation is super!

  • @vatsalshingala3225
    @vatsalshingala3225 7 months ago

    ❤❤❤❤❤❤❤❤❤❤

  • @not_amanullah
    @not_amanullah 7 months ago

    Understood++

  • @not_amanullah
    @not_amanullah 8 months ago

  • @nomannosher8928
    @nomannosher8928 6 months ago

    I first like the video, then watch it, because I know the quality of the lecture.

  • @sheetalsharma4366
    @sheetalsharma4366 4 months ago

    Amazing explanation. Thank you very much sir.

  • @mohammadiqbal-c1o
    @mohammadiqbal-c1o 9 months ago

    You are great, sir.

  • @ankitgupta1806
    @ankitgupta1806 6 months ago

    Thank you.

  • @Manishkumar-iw1cy
    @Manishkumar-iw1cy 9 months ago

    Worth it

  • @shaktisd
    @shaktisd 9 months ago

    Very good explanation. Please make a self-attention video also.

    • @shaktisd
      @shaktisd 9 months ago

      @Shuraim843 Really? That's news. Do you mind explaining in detail your understanding of self-attention and how LLMs use it to produce SOTA results?

  • @abromioitis
    @abromioitis 9 months ago

    Sir, please also bring the implementation video for the encoder-decoder.