Recurrent Neural Network | Forward Propagation | Architecture

  • Published: 3 Oct 2024

Comments • 130

  • @Sidcr07
    @Sidcr07 1 year ago +97

    Sir, please don't leave this playlist incomplete. You can't imagine how much of a blessing it has become for so many people. It's my heartfelt request, and I'm sure I'm not the only one asking. Please keep updating this playlist. I have given up many paid courses to study from your masterpiece videos. Please, sir, it's a humble request 🙏🙏🙏🙏

    • @campusx-official
      @campusx-official  1 year ago +44

      Not leaving it, brother. I was travelling this weekend so I could not upload. Will upload the next video tomorrow.

    • @Sidcr07
      @Sidcr07 1 year ago +1

      @@campusx-official Thanks a lot, sir, for everything!

    • @smruti_06240
      @smruti_06240 1 year ago

      @@campusx-official 🥲 Why aren't you uploading the next video?

    • @geekyprogrammer4831
      @geekyprogrammer4831 1 year ago +2

      I gave up on paid courses too to follow this channel.

    • @endoumamoru3835
      @endoumamoru3835 1 year ago +2

      @@campusx-official Sir, when will the remaining videos come? Please upload those too.

  • @arunchechi1578
    @arunchechi1578 1 year ago +3

    Sir, you are genuinely the best data science teacher... I feel so blessed.

  • @anuradhabalasubramanian9845
    @anuradhabalasubramanian9845 1 year ago +4

    No words to express your teaching style, Sir. So simplified and in-depth. Hats off to your knowledge and presentation. God bless you!

  • @sayantanmanna7
    @sayantanmanna7 1 month ago

    This is the best explanation video on RNNs. Thank you, sir.

  • @adityasrivastava78
    @adityasrivastava78 1 year ago +2

    Beautiful explanation! You are doing a great service to the community!

  • @HappyHumbleHopefulHelpKey
    @HappyHumbleHopefulHelpKey 2 months ago

    This is really a blessing. I love your videos, sir, and how simply and gracefully you explain these topics.

  • @chetankumarnaik9293
    @chetankumarnaik9293 1 year ago +2

    You are just amazing. Please share a video on LLMs or make LLMs part of your mentorship program.

  • @aryangupta2051
    @aryangupta2051 2 months ago

    Very simplified and logical lectures.

  • @tusharsingh1915
    @tusharsingh1915 1 year ago +1

    Thank you, bhaiya, for your hard work. This series means a lot to me; it was so helpful for learning deep learning algorithms and everything. The content you provide is really commendable. Virtual hugs to you. Love from Ghaziabad 🙂❤

  • @afmfarhad2129
    @afmfarhad2129 1 year ago +1

    Amazing explanation 💯💯

  • @deepshikhaagarwal4125
    @deepshikhaagarwal4125 1 year ago +1

    Sir, it would be really helpful if you could provide the remaining videos. After watching your videos we have become addicted to them and aren't comfortable with anyone else's.

  • @alkalinebase
    @alkalinebase 1 year ago +13

    It breaks my heart how shadow-banned you are. I literally have to search for you by typing the entire title because this doesn't show up in the first 20 results or so. You are the best teacher ever. I can't thank you enough, sir!

  • @rashidsiddiqui4502
    @rashidsiddiqui4502 1 year ago +2

    Thank you so much, sir.

  • @Naman_Bansal102
    @Naman_Bansal102 3 months ago

    Great video, sir.

  • @sahuchiragshyamlal3684
    @sahuchiragshyamlal3684 1 year ago +1

    Around 36:00: weight sharing, the input dimension in Keras, and how a sequence of 10 time steps can be stored (see the sketch below).
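
    A minimal Keras sketch of that point (the sizes here are illustrative assumptions, not taken from the video): the input shape fixes 10 time steps per sequence, and the recurrent layer's parameter count does not depend on that number, because the same weights are reused at every step.

    ```python
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(10, 5)),                   # 10 time steps, 5 features per step
        tf.keras.layers.SimpleRNN(3),                    # 3 hidden units
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

    # SimpleRNN params = input->hidden (5*3) + hidden->hidden (3*3) + bias (3) = 27,
    # independent of the 10 time steps -- that is weight sharing across time.
    model.summary()
    ```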

  • @lifeshortsyt786
    @lifeshortsyt786 2 months ago

    Amazing explanation. I wish I could have the notebook.

  • @VineetSinghChauhan-x1n
    @VineetSinghChauhan-x1n 6 months ago +10

    I have never commented on any video before. The way you simplify such concepts makes you the best teacher I have come across in my exploration of data science.

  • @rb4754
    @rb4754 3 months ago

    Super amazing video.

  • @geekyprogrammer4831
    @geekyprogrammer4831 1 year ago +1

    Hi Nitish. Amazing explanation! Breaking down such a complex topic into simple bits! But please do upload backpropagation for RNNs too.

  • @aiforeveryone
    @aiforeveryone 1 year ago +1

    Amazing

  • @ishujain9343
    @ishujain9343 5 months ago +5

    I have studied RNN from many other teachers, but after seeing this video, I really understand how it works. I now have the confidence to explain this topic to anyone. Thank you very much, sir.

  • @rvdjc
    @rvdjc 1 year ago +3

    Thank you for the very clear explanation! Could you please tell me which book you usually follow to understand concepts like RNNs?

  • @ahmadtalhaansari4456
    @ahmadtalhaansari4456 1 year ago +1

    Learning new concepts.
    August 13, 2023😊

  • @AdityaKumar-fq2lq
    @AdityaKumar-fq2lq 1 year ago +6

    Sir, I have watched all the videos... waiting for more. Please make them quickly, sir.

  • @mohsinimam2048
    @mohsinimam2048 1 year ago +1

    Thank you so much!

  • @geetakashyap411
    @geetakashyap411 5 months ago

    Sir, how easily you clarify all the difficult concepts!

  • @rog0079
    @rog0079 1 year ago +5

    Hi, first of all, great content! I was looking forward to this playlist, but I see it hasn't been updated in six months. I was hoping to learn more about LSTMs, GRUs, Transformers, BERT, GPT, LLMs, etc. through it. I hope you continue the series soon. Thanks anyway :)

  • @sandipansarkar9211
    @sandipansarkar9211 1 year ago +3

    Where is the link for this Google Colab?

  • @mohsinimam2048
    @mohsinimam2048 1 year ago +1

    Please bring videos on LSTM and BERT

  • @rahuljha3686
    @rahuljha3686 9 months ago +2

    Input format in an RNN -> (timesteps, input_features); when the first word is sent, time = 1.
    "Movie was good" is sent in as a tensor of shape (3, 5), where timesteps = 3 and input_features = 5.
    Unfolding through time -> every node has an activation function, which by default is tanh.
    The output of one time step is fed back into the nodes at the next time step (see the sketch below).
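
    A small NumPy sketch of that forward pass (shapes and weights are made up for illustration, not the video's notebook): an input of shape (timesteps=3, features=5), tanh activation, and the hidden state fed back at every step.

    ```python
    import numpy as np

    timesteps, features, units = 3, 5, 4
    x = np.random.randn(timesteps, features)      # "Movie was good" as 3 vectors of size 5
    W_xh = np.random.randn(features, units)       # input -> hidden weights (shared across time)
    W_hh = np.random.randn(units, units)          # hidden -> hidden (feedback) weights
    b = np.zeros(units)

    h = np.zeros(units)                           # initial hidden state
    for t in range(timesteps):                    # unfolding through time
        h = np.tanh(x[t] @ W_xh + h @ W_hh + b)   # previous output feeds the next step

    print(h)                                      # final hidden state after the last word
    ```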

  • @vishnusit1
    @vishnusit1 1 year ago +2

    Thanks

  • @vishnusit1
    @vishnusit1 1 year ago +1

    Hi Nitish, I appreciate the way you explain RNNs, most importantly the mathematical calculations. Can you suggest a book that explains deep learning in your style, yes, in your style? I started reading many DL books, but their mathematical explanations were confusing and complex for me. Please respond.

    • @campusx-official
      @campusx-official  1 year ago

      Maybe Deep Learning with Python will suit your need.

  • @rafibasha4145
    @rafibasha4145 1 year ago +4

    It's been 8 days and no video yet. Please upload 2 videos per week and finish the playlist.

    • @campusx-official
      @campusx-official  1 year ago +5

      Will upload the next video on Tuesday. I was travelling this week so I could not shoot.

    • @rafibasha4145
      @rafibasha4145 1 year ago

      @@campusx-official Thanks, bhai. Please finish feature selection, XGBoost, and CatBoost from ML, then DL and advanced NLP along with Python. If possible, start MLOps as well.

  • @NabidAlam360
    @NabidAlam360 4 months ago

    So good! We want at least 1M subscribers for you!

  • @divyanshtuli
    @divyanshtuli 1 year ago +2

    When will this playlist restart?

  • @aryanchandra6298
    @aryanchandra6298 1 year ago +3

    Sir, please update this playlist; we are eagerly waiting for its completion.

  • @ali75988
    @ali75988 10 months ago +1

    11:33 For anyone confused that the input sizes are different here and wondering how unequal-length inputs could be used: the answer is in the next lecture.
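
    One common way to handle unequal-length sequences (a hedged illustration; the next lecture in the series gives the course's actual answer) is to pad them to a common number of time steps:

    ```python
    from tensorflow.keras.preprocessing.sequence import pad_sequences

    sequences = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]      # word indices of different lengths
    padded = pad_sequences(sequences, padding="post")  # pad with zeros at the end
    print(padded)
    # [[1 2 3 0]
    #  [4 5 0 0]
    #  [6 7 8 9]]
    ```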

  • @soumilyade1057
    @soumilyade1057 1 year ago +1

    I had already subscribed to the channel, but the video didn't appear in the search results 😑. Because it's not in English? Maybe!

  • @curiousseeker3784
    @curiousseeker3784 1 year ago +1

    Around 39:40 you said the final output layer (activated by softmax/sigmoid) is used only for the last time step. I guess that's only for this RNN (sentiment analysis); otherwise, for each recurrent block, we may use a final output layer too, along with the hidden layers?
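
    A hedged sketch of that distinction in Keras terms (layer sizes are illustrative): whether the recurrent layer exposes an output at every time step, so that each step can feed its own softmax/sigmoid output layer, or only at the last one, is controlled by return_sequences.

    ```python
    import tensorflow as tf

    # Many-to-one (e.g. sentiment analysis): only the last time step's output is used.
    many_to_one = tf.keras.layers.SimpleRNN(8, return_sequences=False)

    # Many-to-many (e.g. tagging every word): one output per time step.
    many_to_many = tf.keras.layers.SimpleRNN(8, return_sequences=True)

    x = tf.random.normal((1, 3, 5))          # (batch, timesteps, features)
    print(many_to_one(x).shape)              # (1, 8)    -> last step only
    print(many_to_many(x).shape)             # (1, 3, 8) -> one vector per step
    ```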

  • @HerambYt
    @HerambYt 9 months ago +1

    At t=2, why is the input connected to only 1 hidden layer? It should be connected to all (3) hidden layers.

  • @muhammadishaq-co4fu
    @muhammadishaq-co4fu 1 year ago +1

    Sir, I hope you are doing well.
    Sir, please tell me how I can have a question-and-answer session with you.
    Please!

  • @yashjain6372
    @yashjain6372 1 year ago +1

    best

  • @saurabhsingh1794
    @saurabhsingh1794 1 year ago +1

    Hi Nitish, can you upload videos on encoders, decoders, attention models, and Transformers? By the way, your explanation is great. Good work.

  • @messi0510
    @messi0510 1 year ago +1

    11:35 to 14:35: the 2 main differences between an RNN and an ANN.

  • @riyatiwari4767
    @riyatiwari4767 10 months ago +2

    This playlist is amazing! One of the best for understanding deep learning. Thanks a lot! 🎉

  • @dattatreyanh6121
    @dattatreyanh6121 9 months ago +1

    Can you make a playlist on CNNs with LSTM blocks?

  • @jooeeemusic7963
    @jooeeemusic7963 1 year ago +1

    Sir, please continue this playlist. Please!

  • @ritesh_b
    @ritesh_b 1 year ago +2

    Please upload the XGBoost video, sir.

  • @suhaspatil8682
    @suhaspatil8682 1 year ago +1

    Sir, how did you find the vectors, like 10000 for "movie", 01000 for "was", etc.?
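
    Those vectors are just one-hot encodings over the vocabulary: each word gets a 1 at its own index and 0 everywhere else. A minimal sketch (the vocabulary and its order here are illustrative, not taken from the video):

    ```python
    import numpy as np

    vocab = ["movie", "was", "good", "bad", "not"]   # assumed 5-word vocabulary
    index = {word: i for i, word in enumerate(vocab)}

    def one_hot(word):
        vec = np.zeros(len(vocab), dtype=int)
        vec[index[word]] = 1                         # 1 at the word's position, 0 elsewhere
        return vec

    print(one_hot("movie"))   # [1 0 0 0 0]
    print(one_hot("was"))     # [0 1 0 0 0]
    ```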

  • @SanhitaSaxena
    @SanhitaSaxena 4 months ago +1

    This is hands down the best deep learning playlist out there. I am following it blindly because I know sir will surely cover all the content in depth, and I won't have to look for it anywhere else.

  • @MurariMahaseth
    @MurariMahaseth 1 year ago +1

    Thank you, sir. Can you please suggest a book for more information?

  • @nallakrishna8796
    @nallakrishna8796 1 year ago +1

    Sir, the way you explain things is simply amazing. I had been eagerly searching for the mathematical intuition behind RNNs, and once I watched your lecture I stopped searching. You made the concept more understandable, step by step. Please keep uploading these kinds of concepts; they will help upcoming students learn from you. Sir, it's my humble request.

  • @ram_c
    @ram_c 1 year ago +1

    Please make tutorials on PINNs. Thank you.

  • @AbcdAbcd-ol5hn
    @AbcdAbcd-ol5hn 1 year ago +1

    Sir, please complete this playlist. Thank you so much for the videos uploaded so far...

  • @debojitmandal8670
    @debojitmandal8670 1 year ago +1

    Sir, I still don't understand one thing. The working you explained shows how the order of words is maintained, i.e. which word comes before which, but it still doesn't explain how the context or semantic meaning is retained, or how much impact each word has on the other words.

    • @priyadarshichatterjee7933
      @priyadarshichatterjee7933 1 year ago

      The semantics are retained because we follow this feedback approach: while unrolling, at every time step we use the output of the previous step as an input to the next step. Unlike an ANN, all the inputs do not enter the network at once, so if we change the order of the inputs at the first step, the corresponding outputs O1, O2, O3 will all change, thus preserving the semantics of the original words you input (a small illustration follows below).
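
      A small NumPy illustration of that point (shapes and weights are made up): because each step's output feeds the next step, reordering the words changes the RNN's final state, whereas a plain sum of the same inputs would not.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)
      W_xh, W_hh = rng.normal(size=(5, 4)), rng.normal(size=(4, 4))

      def rnn_final_state(words):
          h = np.zeros(4)
          for x in words:
              h = np.tanh(x @ W_xh + h @ W_hh)   # feedback: previous output enters the next step
          return h

      w1, w2, w3 = rng.normal(size=(3, 5))       # three word vectors
      print(rnn_final_state([w1, w2, w3]))       # original order
      print(rnn_final_state([w3, w2, w1]))       # reversed order -> a different final state
      ```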

  • @piyushpathak7311
    @piyushpathak7311 1 year ago +2

    Please upload the next video, sir.

  • @murtuzakhan3655
    @murtuzakhan3655 1 year ago +1

    😭😭😭 how many thanks should we say to you ❤️❤️❤️❤️

  • @rafibasha4145
    @rafibasha4145 1 year ago +1

    @17:37 Please let me know why one neuron's output should go to another neuron in the same layer.

  • @ParthivShah
    @ParthivShah 5 months ago +1

    Thank You Sir.

  • @parthdhore3757
    @parthdhore3757 1 year ago +1

    Could you please update the ML Roadmap 2022 with the CNN and RNN videos?

  • @debojitmandal8670
    @debojitmandal8670 1 year ago +1

    Hi sir, when will you upload videos on Transformers?

  • @aritradutta9538
    @aritradutta9538 7 months ago +1

    The best explanation of the RNN architecture ever. Thanks a ton.

  • @74kumarrohit
    @74kumarrohit 8 months ago +1

    Simply the best. I have been recommending this channel to everyone.

  • @shubhamSharma-gw1oe
    @shubhamSharma-gw1oe 1 year ago +1

    No words to express your teaching style, Sir. So simplified and in-depth. Hats off to your knowledge and presentation. God bless you!

  • @sanchitdeepsingh9663
    @sanchitdeepsingh9663 10 months ago +1

    Thanks, sir, for the simple explanation.

  • @dakshbhatnagar
    @dakshbhatnagar 1 year ago +1

    Hey bhai, the way you go into depth is great. Whatever deep learning I know is because of you.
    Plus, the Hinglish hits differently when learning new things. At least for me 😅

  • @rachitsingh4913
    @rachitsingh4913 9 months ago +1

    The best❤

  • @prashantm9856
    @prashantm9856 9 months ago +1

    Most beautiful explanation sir ❤❤❤❤

  • @EasySatistica
    @EasySatistica 1 year ago +1

    Really great effort, sir. Please complete this playlist.

  • @AniketSharma-gq2by
    @AniketSharma-gq2by 1 year ago +1

    Oh, the complicated RNN! Bhaiya ji has started it, so much happiness 🤡. Time series next? 👀👀

  • @aadityaadyotshrivastava2030
    @aadityaadyotshrivastava2030 8 months ago +1

    super

  • @mohdadil2875
    @mohdadil2875 1 month ago +2

    Does anyone have complete notes for this playlist?

  • @sandipansarkar9211
    @sandipansarkar9211 1 year ago +1

    Finished watching and coding.

  • @ROHITRAJ-c6g
    @ROHITRAJ-c6g 1 month ago

    So humbled by your dedication, sir. Kudos for the great teaching!

  • @pavangoyal6840
    @pavangoyal6840 10 months ago +1

    wonderful

  • @vishnusit1
    @vishnusit1 1 year ago +1

    Great explanation.

  • @himanshuvora1980
    @himanshuvora1980 11 months ago +1

    💚💚💚💚

  • @narendraparmar1631
    @narendraparmar1631 7 months ago

    Good explanation, but how does the past information affect the future ones?

  • @abdulqadar9580
    @abdulqadar9580 1 year ago +1

    Thank you, sir, for your great efforts.

  • @navneetgupta4669
    @navneetgupta4669 1 year ago +1

    Sir, when will we get the new video?

  • @flakky626
    @flakky626 1 year ago +2

    Sir, I'm literally crying while typing this; I have tears in my eyes.
    Thanks a lot, sir.

  • @stevesamson-p2k
    @stevesamson-p2k 1 year ago +1

    Awesome. Simple and detailed explanation of RNN.

  • @deepshikhaagarwal4125
    @deepshikhaagarwal4125 1 year ago +1

    Nitish sir, amazing!

  • @garimadhanania1853
    @garimadhanania1853 4 months ago

    God-level content! Simply amazing!

  • @khoja110
    @khoja110 1 year ago +1

    So beautiful; stay blessed.

  • @ali75988
    @ali75988 10 months ago +1

    Badshaho, hats off to you. You've nailed it.

  • @anirudh6543
    @anirudh6543 6 months ago

    Sir, please provide the notes and code links.

  • @ItsPawan005
    @ItsPawan005 6 months ago

    Superb explanation!
    No need for recurrence (replaying the video) once it is interpreted this neatly.

  • @carti8778
    @carti8778 1 year ago +1

    Thanks

  • @pranavreddy9218
    @pranavreddy9218 1 month ago

    Are the weights the same across all time steps?
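
    Yes, that is the weight-sharing idea: the same input-to-hidden and hidden-to-hidden weights are applied at every time step. A hedged check in Keras terms (sizes are illustrative): a SimpleRNN layer holds just one kernel, one recurrent kernel, and one bias, no matter how many time steps it is run over.

    ```python
    import tensorflow as tf

    layer = tf.keras.layers.SimpleRNN(4)
    x = tf.random.normal((1, 100, 5))        # 1 sequence, 100 time steps, 5 features
    _ = layer(x)                             # build and run the layer once
    print([w.shape for w in layer.weights])  # three tensors only: (5, 4), (4, 4), (4,)
    ```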

  • @AdityaKumar-fq2lq
    @AdityaKumar-fq2lq 1 year ago +1

    Sir, very good video.

  • @AakashGoyal25
    @AakashGoyal25 2 months ago

    Best explanation ever :)

  • @siddheshmadkaikar1645
    @siddheshmadkaikar1645 5 months ago

    Amazing explanation!

  • @VarunMalik-mo6mr
    @VarunMalik-mo6mr 4 months ago

    Thank you Nitish sir 💯

  • @ajitkumarpatel2048
    @ajitkumarpatel2048 1 year ago +1

    🙏

  • @sunny739
    @sunny739 6 months ago

    Sir, your tutorials are like: watch once and you're done understanding the concept.
    Later, all we need is to revise a couple of times and the topic is done.
    Thank you for everything 🙏

  • @ameyagurjar2576
    @ameyagurjar2576 6 months ago

    This man is a legend ❤‍🔥❤‍🔥

  • @SanjayKumar-n5l1z
    @SanjayKumar-n5l1z 2 months ago

    This is exactly what I needed to understand RNN better. Thanks!