Gradient Descent For Neural Network | Deep Learning Tutorial 12 (Tensorflow2.0, Keras & Python)

  • Published: 26 Oct 2024

Comments • 341

  • @codebasics
    @codebasics  2 years ago +12

    Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced

    • @honeymilongton8401
      @honeymilongton8401 2 years ago

      Sir, can you kindly post the slides in the description so that we can understand the videos more clearly?

    • @RAHUL-gf3ft
      @RAHUL-gf3ft 2 years ago

      @@honeymilongton8401 no

    • @fahadshaikh9099
      @fahadshaikh9099 1 year ago

      @@narutoshippuden7024 I was looking for the same. Did you figure out where to get that dataset from?

  • @prashantmahajan8236
    @prashantmahajan8236 1 year ago +33

    People are just wasting money on online courses; this man has given you an excellent tutorial.

  • @hetthummar9582
    @hetthummar9582 3 years ago +96

    One of the greatest playlists on deep learning. You make it so simple.

    • @codebasics
      @codebasics  3 years ago +8

      ☺️👍

    • @rensraks2007
      @rensraks2007 3 years ago +5

      Agreed 100%. Taught really well.

    • @ywf98
      @ywf98 3 years ago +2

      Yes, he makes it so simple. Thanks a lot.

    • @dudefromsa
      @dudefromsa 2 years ago +2

      Completely agree

  • @deepakavva317
    @deepakavva317 3 years ago +7

    I watched more than 30 videos on this concept... none of them gave me confidence in it. Thanks, sir...

    • @codebasics
      @codebasics  3 years ago +2

      Deepak you can do it. I wish you all the best 👍☺️

  • @SaiKrishna-ik1ud
    @SaiKrishna-ik1ud 3 years ago +12

    My first comment on YouTube: what a great explanation. I spent more than one lakh to get a certification in AI/ML, but ultimately I am learning the concepts from you. Your teaching is like training where the weight (knowledge) improves but the loss (fee) is zero.

  • @nayture_man
    @nayture_man 9 months ago +1

    The best thing on YouTube for a data science learner is your channel. Thank you for making these videos, which are like holy scriptures for learners out here. I am blown away by the simplicity with which you make these daunting topics appear so easy to understand. You never disappoint in helping us grasp these concepts. Big thanks, Dhaval.

  • @sambangichantibabu8590
    @sambangichantibabu8590 17 days ago +1

    29:39 Log loss function for each epoch (all rows at a time). Nice video.

  • @tps8470
    @tps8470 7 months ago +1

    This is one of the best explanations of gradient descent. Thanks a lot, sir.

  • @aditipanigrahi3730
    @aditipanigrahi3730 3 years ago +10

    You totally deserve a lot of recognition, since the concepts are explained so nicely and we get a chance to code along too.

    • @aditipanigrahi3730
      @aditipanigrahi3730 2 years ago

      Hi @codebasics, where can I find the raw files that you are using? I am coding along.

  • @陳翰儒-d5m
    @陳翰儒-d5m 3 years ago +16

    Man, you just taught me so well.
    I had trained a few models and tried some machine learning projects before watching your videos.
    But the truth is I didn't really spend time on "really understanding" the math in machine learning.
    Today I finally understand it. Thank you very much.
    Great series. I promise I will finish as many of your machine learning videos as I can.

    • @mazharbukhari786
      @mazharbukhari786 3 years ago +3

      As a matter of fact, I would have written the same comment, but 陳翰儒 mentioned it before me.

  • @ahsanali5527
    @ahsanali5527 2 years ago +1

    Yaar, hats off to this man, who gets into such details and implementations.

  • @adityanarayangupta1
    @adityanarayangupta1 2 years ago +9

    Hi Dhaval, this is a great explanation. I can't thank you enough for the effort you are putting in to help the data science community.

  • @jorgenb8464
    @jorgenb8464 3 years ago +1

    This summed up a bachelor's degree of knowledge, thanks.

  • @51vijaytakbhate31
    @51vijaytakbhate31 1 year ago +1

    I have never seen such simplicity in teaching. Thanks 👍

  • @مهلامعظمیگودرزی

    You are the best teacher I have ever seen. Thank you very much.

  • @vishaljaiswar6441
    @vishaljaiswar6441 2 years ago

    You have the best and most intuitive explanation of concepts and code, EVER. Thanks!!

    • @codebasics
      @codebasics  2 years ago

      Glad it was helpful Vishal!

  • @abhi9029
    @abhi9029 1 year ago

    Superb explanation and practical session. I have never seen anything like it on YouTube.

  • @SKCS-uy3sd
    @SKCS-uy3sd 1 year ago +2

    When I started deep learning I didn't understand the logic behind the formula Σ(weights*input) + bias. Thank you, sir ❤️❤️
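
    For illustration, here is a minimal sketch of that prediction function for the two-feature insurance example in the video (the weight and bias values below are made-up assumptions, not the trained ones):

    import numpy as np

    def sigmoid(x):
        # squashes the weighted sum into the (0, 1) range
        return 1 / (1 + np.exp(-x))

    def predict(age, affordability, w1, w2, bias):
        # Σ(weights * inputs) + bias, then sigmoid for a probability
        weighted_sum = w1 * age + w2 * affordability + bias
        return sigmoid(weighted_sum)

    # e.g. scaled age 0.47, affordability 1
    print(predict(0.47, 1, w1=5.0, w2=1.5, bias=-2.9))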

  • @Pikachu-uk9zw
    @Pikachu-uk9zw 2 years ago

    I just want to say I really, really like the expression on your face when explaining something. Anyway, thank you for making things so simple for beginners.

  • @sumit121285
    @sumit121285 3 years ago

    You are simply superb, sir....
    There are so many who know gradient descent, but very few of them can teach how gradient descent works the way you do... hats off, sir... thank you so much... thanks a lot...

  • @sakethamargani8846
    @sakethamargani8846 3 years ago +3

    Your teaching skills are awesome. Thank you for making these great tutorials.

  • @GojoKamando
    @GojoKamando 10 months ago

    Best lecture I've gotten so far in my data science course. Thank you so much, sir 💗

  • @americovaldazo6373
    @americovaldazo6373 3 years ago

    You've opened my mind with this video. Very helpful for deep learning understanding. Thank you from Argentina.

  • @Breaking_Bold
    @Breaking_Bold 1 year ago

    Excellent teacher!!! Great scientist!!! Very nice video... the best on YouTube related to machine learning.

  • @Taher-p7i
    @Taher-p7i 4 months ago

    You make it quite easy to understand.
    Hats off for the explanation,
    hats off for the effort,
    hats off for the content.
    Thanks a lot, sir :)

  • @mohitagrawal4479
    @mohitagrawal4479 4 years ago +2

    The way of teaching is awesome. Great explanation. Your effort is appreciated

  • @BharatSingh-zk8lx
    @BharatSingh-zk8lx 2 years ago

    Haha, at first I wasn't going to watch because it was 40 minutes, but you just made it so engaging. I loved how you explained it and compared it in Python.

  • @movie6819
    @movie6819 2 years ago

    I am from Bangladesh. Great explanation with a combination of math and code, which is exactly what I wanted.

  • @yogeshbharadwaj6200
    @yogeshbharadwaj6200 3 years ago +1

    Great explanation... never thought I'd understand gradient descent so well, and that too with code... You showed a very granular explanation of how we can write Python code to get the same output as the model... that's awesome... you are increasing confidence in many people... great work... thanks a lot...

  • @nachoeigu
    @nachoeigu 2 years ago

    It is the best course I have ever seen. Thank you very much!

  • @OceanAlves23
    @OceanAlves23 4 years ago +9

    Hi, great class. Congratulations from Brazil-Teresina-PI

    • @codebasics
      @codebasics  4 years ago +1

      Glad you liked it, Ocean.

    • @SudhanshuKumar-lp1nr
      @SudhanshuKumar-lp1nr 3 years ago +1

      @@codebasics How did TensorFlow know that it has to stop at loss 0.4631?

    • @subhamsekharpradhan297
      @subhamsekharpradhan297 3 years ago +1

      @@SudhanshuKumar-lp1nr That value is variable; if you increase the epochs, the loss will dip even more (see the sketch after this thread).

    • @SudhanshuKumar-lp1nr
      @SudhanshuKumar-lp1nr 3 years ago

      @@subhamsekharpradhan297 thanks
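
      For illustration, a minimal Keras sketch of what is going on (the tiny dataset, layer size, and epoch counts here are assumptions, not the video's exact code). TensorFlow doesn't "stop at" 0.4631; that is simply where the loss stood when the requested epochs ran out, and training for more epochs lets it dip further before plateauing:

      import numpy as np
      import tensorflow as tf

      # tiny synthetic stand-in for the scaled age / affordability data
      X_train = np.array([[0.22, 1], [0.25, 0], [0.47, 1],
                          [0.52, 0], [0.56, 1], [0.62, 1]])
      y_train = np.array([0, 0, 1, 0, 1, 1])

      # single sigmoid neuron, as in the video's setup
      model = tf.keras.Sequential([
          tf.keras.layers.Dense(1, input_shape=(2,), activation='sigmoid')
      ])
      model.compile(optimizer='adam', loss='binary_crossentropy',
                    metrics=['accuracy'])

      model.fit(X_train, y_train, epochs=500, verbose=0)
      print(model.evaluate(X_train, y_train, verbose=0))   # [loss, accuracy]
      model.fit(X_train, y_train, epochs=5000, verbose=0)  # training continues
      print(model.evaluate(X_train, y_train, verbose=0))   # loss is lower now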

  • @dilipkumar2k6
    @dilipkumar2k6 2 years ago +2

    Loved the way you implemented gradient descent yourself. I started feeling ML :-)

  • @UpendraKumar-zc8lm
    @UpendraKumar-zc8lm 3 years ago

    The way you explain is just awesome, I must say. You explain a topic like deep learning in such an attractive way that I never get enough of watching your videos.
    Your presentations are superb, making everything easy for us to understand and retain for a long time.

    • @codebasics
      @codebasics  3 years ago +1

      Thanks for your kind words. If you found this useful, please share it with your friends through LinkedIn, WhatsApp, or Facebook so this content can reach more people who are studying this topic.

    • @UpendraKumar-zc8lm
      @UpendraKumar-zc8lm 3 years ago

      @@codebasics sure sir,

  • @chetann5745
    @chetann5745 3 years ago

    Wow, Dhaval sir,
    you can turn anybody into a potential data scientist.

  • @procoder7099
    @procoder7099 3 years ago

    Thank you so much, bhaiya, for helping... I was feeling so lost before I stumbled onto your playlist.

  • @David_Clement
    @David_Clement 10 days ago

    Wow, this was amazing.
    Thank you, Dhaval Patel ji 🙏❤

  • @udaysadhukhan1
    @udaysadhukhan1 4 years ago

    Hi, I have been studying your tutorial series for the last 15 days. It is super and is helping me a lot in learning ML and DL. Thank you very much for this excellent teaching and presentation with exercises. I hope your health is doing well. Praying to God for your very fast recovery...

    • @codebasics
      @codebasics  4 years ago +1

      Hey Uday, thanks for your prayers, my friend. Yes, my health is improving steadily 😊👍

  • @Rafian1924
    @Rafian1924 6 months ago

    Genius trainer ❤❤ You are gifted, brother 😊

  • @shaiksuleman3191
    @shaiksuleman3191 4 years ago

    Simply superb. No more questions. Your videos are like medicine with no side effects.

  • @its_kumar
    @its_kumar 4 years ago +1

    Very nice explanation.
    I was always confused by gradient descent, but now I understand it well. 👍 👍
    Thanks

    • @codebasics
      @codebasics  4 years ago +1

      I am happy this was helpful to you.

    • @RAHULJAIN-ow7jl
      @RAHULJAIN-ow7jl 3 years ago

      @@codebasics Please share the dataset link.

  • @dog-mn8ef
    @dog-mn8ef 1 year ago +1

    Excellent videos! They are very detailed. Thank you for sharing your knowledge.

  • @gursewak_007
    @gursewak_007 2 years ago

    Your detailed explanation helped me understand how neural nets actually work. Thank you for such good content; it helped a lot.

  • @shreyapaurav5930
    @shreyapaurav5930 3 years ago

    Clear explanation. You made it so simple to grasp. Kudos

  • @arvindkumargupta8856
    @arvindkumargupta8856 3 years ago

    Excellent, sir. Your way of teaching is very good. I got it easily. Thank you infinite times, sir.

  • @techsavy5669
    @techsavy5669 3 years ago +1

    Nice, the initial iteration using one value at a time is very good. Loved it.

  • @sandiproy330
    @sandiproy330 1 year ago

    Nice lucid explanation with simple codes. Great effort. Thank you so much.

  • @aaravjain3621
    @aaravjain3621 3 years ago

    Your explanations are simple and in-depth... thank you.

  • @brayanrai2880
    @brayanrai2880 25 days ago

    Crazy, excellent playlist going! I love it.

  • @saurabhawankhede8561
    @saurabhawankhede8561 2 years ago

    Sir, hats off. This is complex, yet it seems so simple as you explain it. Thanks won't be enough.

  • @charmindesai3730
    @charmindesai3730 2 years ago

    Great explanation. A wise man once said real intelligence is making difficult things simple. You are an example, sir. Thank you very much.

  • @ilker.gungor
    @ilker.gungor 8 months ago

    Every single second of this video is crucial.

  • @ankurchaudhary3415
    @ankurchaudhary3415 3 years ago

    Big fan of yours. What an explanation, man!!!!! Kudos... keep it up.

  • @majidaly7632
    @majidaly7632 3 years ago

    The way of teaching is awesome

    • @codebasics
      @codebasics  3 years ago

      I am happy this was helpful to you.

  • @azharjameel
    @azharjameel 1 year ago

    Thumbs up; you proved that a neural network is not a “black box”.

  • @Shanvika0501
    @Shanvika0501 2 years ago

    Excellent explanation. A complex concept made so simple.

  • @SukhwinderSinghchail3042
    @SukhwinderSinghchail3042 5 months ago

    Sir, great tutorials... I just hope you also start giving certificates to people who complete watching these videos :)

  • @arjunreddy7203
    @arjunreddy7203 3 years ago

    Favorite videos for learning a very complicated topic!

  • @kavyasharma4738
    @kavyasharma4738 3 years ago

    Thank you for such a simple explanation of gradient descent for neural networks.

  • @yasamannazemi6706
    @yasamannazemi6706 3 years ago +7

    Actually, your course is one of the best deep learning tutorials ever. Love it. How can I download the insurance data for training in my own Jupyter notebook?

    • @lespri
      @lespri 10 months ago

      From the GitHub link in the description.

  • @YadavAjay-f6u
    @YadavAjay-f6u 1 month ago

    Sir, your tutorials are too good.

  • @nilupulperera
    @nilupulperera 1 year ago

    Excellent. Simply awesome!

  • @krishvinraam4734
    @krishvinraam4734 2 years ago

    Hello sir,
    This is Krishvin. In 'def gradient_descent(age, affordability, y_true, epochs, loss_thresold):', the line 'print(f'Epoch:{i}, w1:{w1}, w2:{w2}, bias:{bias}, loss:{loss}')' comes after w1, w2, and bias have been updated, so for each epoch the next weights get printed. That is why we don't see w1=1, w2=1, b=0 for the first epoch; the printed weights are shifted by one update. I'm a huge fan of your tutorials: complex things are taught in a simple way, which makes me really curious to learn more. Thank you, sir.
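
    For illustration, a minimal sketch of that function with the print moved before the update, so epoch 0 shows the initial w1=1, w2=1, bias=0 (the helper shapes and learning rate are my assumptions, not the author's exact code; the 'loss_thresold' spelling is kept from the comment above):

    import numpy as np

    def sigmoid_numpy(x):
        return 1 / (1 + np.exp(-x))

    def log_loss(y_true, y_predicted):
        eps = 1e-15  # clip so log(0) never occurs
        y_predicted = np.clip(y_predicted, eps, 1 - eps)
        return -np.mean(y_true * np.log(y_predicted)
                        + (1 - y_true) * np.log(1 - y_predicted))

    def gradient_descent(age, affordability, y_true, epochs, loss_thresold, rate=0.5):
        w1 = w2 = 1
        bias = 0
        for i in range(epochs):
            y_predicted = sigmoid_numpy(w1 * age + w2 * affordability + bias)
            loss = log_loss(y_true, y_predicted)
            # printing BEFORE the update shows w1=1, w2=1, bias=0 at epoch 0
            print(f'Epoch:{i}, w1:{w1}, w2:{w2}, bias:{bias}, loss:{loss}')
            if loss <= loss_thresold:
                break
            diff = y_predicted - y_true
            w1 = w1 - rate * np.mean(age * diff)            # d(loss)/d(w1)
            w2 = w2 - rate * np.mean(affordability * diff)  # d(loss)/d(w2)
            bias = bias - rate * np.mean(diff)              # d(loss)/d(bias)
        return w1, w2, bias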

  • @bipratim1
    @bipratim1 3 years ago

    Thank you. Seeing your video, even a 10th grader can understand deep learning.

  • @dec13666
    @dec13666 3 years ago

    Nicely done! I have been following your tutorials since the beginning, and they are fantastic! Eager to check out your next video.

  • @electric_sand
    @electric_sand 4 years ago

    An awesome person you are...cheers from Nigeria

  • @mutalasuragemohammed6954
    @mutalasuragemohammed6954 1 year ago

    Beautiful explanations.
    Thank you.

  • @amillion7582
    @amillion7582 2 years ago

    I just cannot thank you enough. Blessings.

  • @patelaryan5434
    @patelaryan5434 4 years ago

    Your explanation is very, very nice, sir. It helped me a lot. Thank you so much, sir.

  • @ahmedhesham3125
    @ahmedhesham3125 8 months ago

    For real, this is a great video and a great effort.
    I have a small question: why did you use np.transpose(age)?
    age is a 1-D array, so I think the transpose is useless in this case.
    We can just do:
    diff = y_predicted - y_true
    w1d = np.mean(age*diff)
    w2d = np.mean(affordability*diff)
    as we did with bias:
    bias_d = np.mean(y_predicted - y_true)
    Thank you very much.
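
    A quick check supports this: transposing a 1-D NumPy array is a no-op, so the np.mean form gives the same derivative as a transpose-and-dot form divided by n (a small illustrative sketch, not the author's code):

    import numpy as np

    age = np.array([0.22, 0.47, 0.62])
    diff = np.array([0.1, -0.2, 0.3])  # stand-in for y_predicted - y_true

    print(np.array_equal(np.transpose(age), age))  # True: transpose is a no-op here

    n = len(age)
    w1d_dot = (1 / n) * np.dot(np.transpose(age), diff)  # transpose-and-dot form
    w1d_mean = np.mean(age * diff)                       # simplified form
    print(np.isclose(w1d_dot, w1d_mean))                 # True: identical result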

  • @gurkanyesilyurt4461
    @gurkanyesilyurt4461 1 year ago

    Thank you so much; I enjoyed this a lot.

  • @jayshreedonga2833
    @jayshreedonga2833 8 months ago

    You are simply great. Thank you.

  • @ajitkulkarni1702
    @ajitkulkarni1702 1 year ago

    Great explanation, sir!

  • @AquarianVikas
    @AquarianVikas 2 years ago

    Thank you for the video; it gave a deep level of insight into the topic as well as practice.

  • @aniruddhapal1997
    @aniruddhapal1997 3 years ago

    Sir, you are doing a great job... deep respect!!!

  • @benmoroney8629
    @benmoroney8629 3 years ago

    You're the greatest. Thank you for these.

  • @ousmanealamakaba3135
    @ousmanealamakaba3135 2 years ago

    Perfect. You are a genius, sir.

  • @shahbaz_shaikh_vip100
    @shahbaz_shaikh_vip100 1 year ago

    Superb explanation 👌🏻👌🏻👌🏻

  • @mprasad3661
    @mprasad3661 2 years ago

    Next-level explanation. Awesome, bro!

  • @sunilsharanappa7721
    @sunilsharanappa7721 3 years ago

    Superb explanation once again. Thanks a lot.

  • @deepanshudutta4443
    @deepanshudutta4443 4 years ago +3

    Sir, please upload the dataset. Other than that, the video really helped me a lot.

  • @vaibhavverma2177
    @vaibhavverma2177 4 months ago

    🎯 Key points for quick navigation:
    00:00 *🧠 Introduction to Gradient Descent*
    - Understanding the importance of gradient descent in machine learning
    - Overview of the theory behind gradient descent
    - The relevance of gradient descent for data scientists and machine learning engineers
    03:00 *📊 Predictive Functions in Machine Learning*
    - Explanation of prediction functions in supervised learning
    - Definition and role of weights and biases in prediction functions
    - Importance of establishing a prediction function for supervised learning techniques
    05:43 *📈 Training a Neural Network with Gradient Descent*
    - Walkthrough of training a neural network using gradient descent
    - Explanation of forward pass and error calculation during training
    - Importance of adjusting weights and biases using derivative and learning rate for training optimization
    17:20 *💻 Implementing Gradient Descent in Python*
    - Demonstration of using TensorFlow to build a simple neural network for prediction
    - Explanation of model compilation and fitting in TensorFlow
    - Interpretation of model accuracy evaluation and prediction results
    26:22 *🧠 Overview of prediction function and calculation*
    - Explanation of the prediction function combining weights, age, affordability, and bias.
    29:00 *📉 Setting up helper methods and implementing gradient descent*
    - Introduction to helper methods like log loss and numpy sigmoid function.
    - Description of implementing the gradient descent function in Python from scratch with specified epochs, initialization, and learning rate.
    33:15 *🔄 Calculating derivatives for updating weights in gradient descent*
    - Demonstration of calculating derivatives for weights, affordability, and bias in the context of updating weights during gradient descent.
    Made with HARPA AI

  • @nirav_ai_ml
    @nirav_ai_ml 1 year ago

    Sir,
    The videos are good, with all the basic information.
    If possible, can you make a combined video on the math concepts related to AI/ML/DNN/NLP?

  • @bildadatsegha6923
    @bildadatsegha6923 11 months ago +1

    Hi,
    I must admit that I really enjoyed this video. However, I have a question about the math analysis of the error/log loss (binary cross-entropy). If we have a predicted value of y_predict = 0.99 and an actual value of y_label = 1, applying the log loss equation according to your video gives 0.01. My calculation applying the formula does not give me 0.01. Can you throw light on this? Maybe my math sucks 😃 or maybe you just used a random value for explanation purposes. Your response will help me assimilate the procedure better. Thanks again for your hard work. I look forward to hearing from you.
    Regards
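
    For reference, working the binary cross-entropy through for a single sample with y = 1 and p = 0.99, using the natural log (which is what np.log computes): loss = -[y*log(p) + (1-y)*log(1-p)] = -log(0.99) ≈ 0.01005, which rounds to the video's 0.01. Using a base-10 log instead gives ≈ 0.0044, a common source of this mismatch. A quick check:

    import numpy as np
    print(-np.log(0.99))    # 0.01005... natural log, matches the video's 0.01
    print(-np.log10(0.99))  # 0.00436... base-10 log gives a different number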

  • @kr.sheelvardhanbanty9136
    @kr.sheelvardhanbanty9136 9 months ago

    This lecture was very in-depth and nicely explained. I just have one question: how do we implement gradient descent with a PyTorch or TensorFlow model? Here you built gradient descent from scratch, but building this for almost 100 inputs is difficult, so can we use something like this with an already-built machine learning model?
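
    For what it's worth, a minimal sketch of the usual approach (the shapes and numbers below are illustrative assumptions): in Keras you don't hand-code the update loop; you pick a gradient-descent optimizer and fit() runs the forward pass, backpropagation, and weight updates for any number of inputs. torch.optim.SGD plays the same role in PyTorch.

    import numpy as np
    import tensorflow as tf

    X = np.random.rand(200, 100)           # e.g. 100 input features
    y = np.random.randint(0, 2, size=200)  # binary labels

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(1, input_shape=(100,), activation='sigmoid')
    ])
    # SGD is plain gradient descent; Keras computes all 101 derivatives itself
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.5),
                  loss='binary_crossentropy')
    model.fit(X, y, epochs=100, verbose=0)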

  • @harshilrami5221
    @harshilrami5221 3 years ago

    Great explanation, sir. Very good content. Thank you so much, sir.

  • @know1374
    @know1374 4 months ago

    Beautiful explanation.

  • @avidreader100
    @avidreader100 1 year ago

    4.31 is to be pronounced as "four point three one", not as "four point thirty-one". Just a suggestion for improvement. I am using this series to revise knowledge acquired from another reputed source, so I join many others in stating that your style of teaching is very good.

    • @codebasics
      @codebasics  1 year ago

      Thanks Sowmyan for the feedback 🙏🏼 I will keep this in mind and wish you all the best!

  • @kumudr
    @kumudr 3 years ago

    Great video, lots of concepts to go through, thanks a lot!

  • @piusranjan
    @piusranjan 3 months ago

    Very nice explanation. Do you have any real-life example of a multi-layer neural network?

  • @يوزرسيف-ظ3ق
    @يوزرسيف-ظ3ق 1 year ago

    Thank you very much from all my heart ♥️

  • @sathiraful
    @sathiraful 3 years ago

    Wow, great tutorial. I have learned a lot. Thanks!

  • @chocky_1874
    @chocky_1874 3 years ago

    Great explanation. I love it. Thank you!

  • @seahseowpeh8278
    @seahseowpeh8278 2 years ago

    Great job. Really useful

  • @rathishanand1
    @rathishanand1 2 years ago

    Hi,
    I need a small clarification. One of the sample data points has Age 25, affordability 0, bought_insurance 1 (without affordability he bought the insurance; that's fine).
    But our model predicts Age 26, affordability 0 from X_test as 0 (meaning insurance not bought). I am a bit confused. Can you please clarify why the two behave in exactly opposite ways?
    Meanwhile, I am learning ML and AI from this channel. Your explanations are too good. Thanks for the dedication.

    • @codebasics
      @codebasics  2 years ago

      It is not necessary that it will always predict things the right way. An ML model generates a prediction function that can *generalize most of the cases* it has seen so far in the training dataset. It is OK to have errors, but the idea is that on the *majority* of the samples in the test dataset it will predict things OK (this is measured by the *accuracy* metric, or model.score()).

  • @gulafshabhatti9410
    @gulafshabhatti9410 3 years ago

    Amazing playlist, sir... I request you to teach how to implement Levenberg-Marquardt using Keras... Please, sir.

  • @jeshanpokharel5482
    @jeshanpokharel5482 3 years ago

    Very well explained, sir. Thank you so much.

    • @codebasics
      @codebasics  3 years ago

      I am happy this was helpful to you.

  • @sanskar_choudhary
    @sanskar_choudhary 2 years ago

    40:28 Awesome🙌

  • @vivekmodi3165
    @vivekmodi3165 3 years ago

    It's simple. Thank you sir.

  • @sandipdeb6550
    @sandipdeb6550 1 year ago

    Sir, nicely explained.

  • @ManethBalasooriya
    @ManethBalasooriya 11 months ago +1

    Mix this series with Andrew Ng's course and Jay Alammar's blogs, and brudda, you are set for an ML job. Trust my word.