Building the Gradient Descent Algorithm in 15 Minutes | Coding Challenge

  • Published: 8 Jun 2024
  • What's happening guys, welcome to the second episode of CodeThat!
    In this ep I try to build a regression machine learning model using a gradient descent algorithm built completely from scratch in Python. The only dependency we use in the challenge is numpy! (A rough sketch of the approach follows this description.)
    Oh, and don't forget to connect with me!
    LinkedIn: bit.ly/324Epgo
    Facebook: bit.ly/3mB1sZD
    GitHub: bit.ly/3mDJllD
    Patreon: bit.ly/2OCn3UW
    Join the Discussion on Discord: bit.ly/3dQiZsV
    Happy coding!
    Nick
    P.S. Let me know how you go and drop a comment if you need a hand!
    #machinelearning #codingchallenge #gradientdescent
  • Science
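
  For reference, here is a minimal sketch of the kind of model the video builds: plain-numpy linear regression trained by gradient descent. The toy data, the descend helper, and the hyperparameter values below are illustrative assumptions, not necessarily Nick's exact code.

    import numpy as np

    # Toy data: y = 2x + 1 plus a little noise
    x = np.random.randn(20, 1)
    y = 2 * x + 1 + np.random.randn(20, 1) / 10

    # Parameters and learning rate (the value that caused trouble in the video!)
    w, b, lr = 0.0, 0.0, 0.01

    def descend(x, y, w, b, lr):
        # Mean squared error: L = (1/n) * sum((y - (w*x + b))^2)
        n = x.shape[0]
        yhat = w * x + b
        # Partial derivatives of the loss with respect to w and b
        dldw = np.sum(-2 * x * (y - yhat)) / n
        dldb = np.sum(-2 * (y - yhat)) / n
        # Step both parameters against their gradients
        return w - lr * dldw, b - lr * dldb

    for epoch in range(400):
        w, b = descend(x, y, w, b, lr)
        if epoch % 100 == 0:
            loss = np.mean((y - (w * x + b)) ** 2)
            print(f"epoch {epoch}: loss {loss:.5f}, w {w:.3f}, b {b:.3f}")

  After a few hundred steps, w and b should approach 2 and 1; with lr left at 0.0 (the slip in the video), the parameters never move at all.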

Comments • 95

  • @javierjdaza
    @javierjdaza 1 year ago +51

    I need to say this: you are the game changer here!!
    As a data scientist with 2+ years of experience, I ALWAYS learn something new from your content! Please Nick, never stop doing these things, and never lose the smile on your face, even when you're hitting bugs!!
    Thanks for everything

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +4

      Thank you so much for your kind words @javierjdaza!

  • @lakshman587
    @lakshman587 1 year ago +20

    Set the time limit to 20 mins from next time, because you are explaining to us as you go.
    This is really awesome!!

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +2

      Thanks a million @Lakshman!! I try to keep it pretty tight so it's a good challenge, otherwise I know I'll just talk for 22 minutes anyway 😅

  • @Beowulf245
    @Beowulf245 1 year ago

    I've been following your channel for a while now and I always find new cool stuff here. Keep up the good work, it's really helpful. Also, I love your positive personality, you really make complex stuff look entertaining.

  • @alyt9870
    @alyt9870 1 year ago +7

    Love the channel Nicholas, have recently graduated from an NLP Master's degree and seeing you explain stuff in a simpler way and your coding challenges is really helping me connect with the material I've learned! Keep it up and I'll keep watching!

    • @NicholasRenotte
      @NicholasRenotte  1 year ago

      Woah congrats @Ally 🎊 🎉 glad you’re enjoying the challenges, plenty more to come!!

  • @nikitaandriievskyi3448
    @nikitaandriievskyi3448 1 year ago +39

    Once you initialized lr to 0.0, I knew you were going to forget to change it lol. Love the challenges tho, keep doing them, I think it would be cool to see how you implement a neural network from scratch

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +8

      I'm still kicking myself that it was the lr that tripped me up 😅. Coding under pressure is literally so different; stuff that should just flow goes out the window. OHHH yeah, I thought about a good challenge building NNs while I was at the gym, stay tuned!

    • @thegeeksides
      @thegeeksides 1 year ago

      @@NicholasRenotte did you ever make that video? A NN from scratch for handwritten digit (MNIST) classification would be so awesome!

  • @spencerbertsch7375
    @spencerbertsch7375 1 year ago +7

    Hey Nicholas! Love your channel and I'm really appreciating these 15 minute coding challenges - please keep it up! Also, you can disable those annoying VS Code popups you ran into at 8:35 by going to Code > Preferences > Settings, typing "editor.hover.enabled", and unchecking the "Editor > Hover: Enabled" option. Hope that's useful!

    • @NicholasRenotte
      @NicholasRenotte  1 year ago

      You are a lifesaver @Spencer, will do it next time I'm on the streaming rig!

  • @BOGABOOfull
    @BOGABOOfull 1 year ago +4

    Awesome video!! It's pretty cool to see such theoretical concepts coded and explained like this. Keep going Nick!!

  • @Powercube7
    @Powercube7 1 year ago +3

    the zoom in on the unsaved icon was personal 💀
    one of the reasons why I use autosave

    • @NicholasRenotte
      @NicholasRenotte  1 year ago

      😅 I was angry at myself when editing, I had to make a point of it lol😂

  • @VictorGiustiniPerez_
    @VictorGiustiniPerez_ 11 months ago

    Really nice video! Love the energy and the enthusiasm. Thanks for the help!

  • @einsteinsboi
    @einsteinsboi 1 year ago +2

    Amazing! I'm learning so much watching you code. Thank you for sharing.

  • @williamstephenjones3863
    @williamstephenjones3863 1 year ago

    This is a very novel and cool way to teach coding. I really enjoyed it, and it was good to see you troubleshoot and get stuff wrong.

  • @cavaliereoscuro1098
    @cavaliereoscuro1098 1 year ago

    The essence of deep learning in a few lines of code... awesome

  • @ibrahim47x
    @ibrahim47x 1 year ago +4

    ChatGPT won this challenge instantaneously lol:
    import numpy as np
    # Set the learning rate
    learning_rate = 0.01
    # Set the number of iterations
    num_iterations = 1000
    # Define the data points
    X = np.array([[0, 1], [1, 0], [1, 1], [0, 0]])
    y = np.array([1, 1, 0, 0])
    # Initialize the weights
    weights = np.zeros(X.shape[1])
    # Train the model
    for i in range(num_iterations):
        # Compute the predicted values
        y_pred = 1 / (1 + np.exp(-1 * np.dot(X, weights)))

        # Compute the error
        error = y - y_pred

        # Update the weights
        weights += learning_rate * np.dot(X.T, error)
    # Print the weights
    print("Weights:", weights)
    A.I. description of the code: "This script defines a simple dataset with four data points and trains a model using the gradient descent algorithm to learn the weights that minimize the error between the predicted values and the true values. The model uses a sigmoid activation function to make predictions.
    The script initializes the weights to zeros, and then iteratively updates the weights using the gradient descent algorithm, computing the predicted values, the error, and the gradient of the error with respect to the weights. The learning rate determines the size of the step taken in each iteration.
    After training the model, the final weights are printed out. You can use these weights to make predictions on new data points by computing the dot product of the data points and the weights, and applying the sigmoid function."

  • @brunospfc8511
    @brunospfc8511 1 year ago +7

    I'll give you half a win, since it was a small detail

  • @spicytuna08
    @spicytuna08 1 year ago

    Wow, you make the subject come alive with excitement and simplicity. You are really gifted. I'd take you over hard-to-understand but smart Ivy League PhD professors any day.

  • @leonardputtmann8404
    @leonardputtmann8404 1 year ago +3

    This was oddly intense. Great job Nicholas! Even though you ran out of time, this video is still a win to me. 😉

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +2

      It definitely felt intense at the time Leonard 😅, the pressure is definitely real. I don't know what it is, but coding under pressure is just a completely different beast. Thanks a million, I'll take the win and thanks for checking it out!

  • @juliansteden2980
    @juliansteden2980 1 year ago +4

    Great video!
    Would be cool to come back to this and add visualization during gradient descent using matplotlib to show what is actually happening.
    For example, drawing the data points, the regression line, and the individual loss between the line and each data point, and showing stats like the current step, w, b, and total loss! :)

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +2

      OHHHH MANNN, I thought about doing that but I was debating whether I'd hit the 15 minute deadline already. Good suggestion @Julian!
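
      For anyone who wants to try Julian's idea, here is a rough, self-contained sketch (my own toy data and variable names, not taken from the video) that redraws the data, the current regression line, and the running stats every few steps:

        import numpy as np
        import matplotlib.pyplot as plt

        x = np.random.randn(30, 1)
        y = 3 * x + 0.5 + np.random.randn(30, 1) / 10
        w, b, lr = 0.0, 0.0, 0.01

        for step in range(200):
            yhat = w * x + b
            # Gradient step on mean squared error
            w -= lr * np.mean(-2 * x * (y - yhat))
            b -= lr * np.mean(-2 * (y - yhat))
            if step % 10 == 0:
                plt.cla()
                plt.scatter(x, y)                    # data points
                plt.plot(x, w * x + b, color="red")  # current regression line
                loss = np.mean((y - yhat) ** 2)
                plt.title(f"step {step}  w={w:.2f}  b={b:.2f}  loss={loss:.3f}")
                plt.pause(0.05)
        plt.show()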

  • @Felicia-126
    @Felicia-126 1 year ago

    Amazing video!! Thank you so much

  • @akumlonglongkumer3824
    @akumlonglongkumer3824 1 year ago

    Pretty impressive. This is awesome. Cheers

  • @darshitgoyani2094
    @darshitgoyani2094 1 year ago

    Lots of Thanks, Nick :)

  • @luis96xd
    @luis96xd 1 year ago

    Great video, I like this kind of video where you code some AI task against the clock; you teach us the concepts and show us the reality of implementing it👏
    Well explained 😄👍

  • @kashishrajput4934
    @kashishrajput4934 1 year ago +3

    That's so informative, thank you so much

  • @rokunuzjahanrudro7571
    @rokunuzjahanrudro7571 7 months ago

    Great video 🎉🎉

  • @sergioquijano7721
    @sergioquijano7721 1 year ago +2

    You are so good at explaining these complicated concepts. Also, if you want to close the Explorer sidebar in VS Code, try Ctrl + B

    • @NicholasRenotte
      @NicholasRenotte  1 year ago

      Legend, thanks a million @Sergio!!

    • @sergioquijano7721
      @sergioquijano7721 1 year ago +1

      @@NicholasRenotte :D I can give you more shortcuts if you tell me where I can learn more about machine learning concepts explained the way you do

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +1

      @@sergioquijano7721 DONE, fair trade!! Been studying this book in a ton of depth this week: themlbook.com/ I threw my own spin on the grad descent example but the fundamentals are in there!

  • @lvjianlvj4604
    @lvjianlvj4604 1 year ago

    I really like this video. It is great!

  • @GallantryX
    @GallantryX 9 months ago

    Wow. This YouTuber has only 197k subscribers for such absolutely high-quality videos. You deserve more than 1M+; the only thing to say is keep grinding, and you'll get there.

  • @adipurnomo5683
    @adipurnomo5683 1 year ago

    Nice implementation bro

  • @_danfiz
    @_danfiz 1 year ago +1

    This is cool, seeing it in real time.

  • @alfathterry7215
    @alfathterry7215 2 months ago

    this is gold!

  • @dipendrathakuri6429
    @dipendrathakuri6429 1 year ago

    I think you missed dividing the derivative by 2. In that formulation of the cost function we have (1/(2 * no. of training examples)) * sum of squared errors, so when we take the derivative, the 2 from dldw and the 1/2 from the cost function cancel each other. Anyway, it was a cool video, keep up the good work brother
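
    In symbols (my rendering of the convention this comment refers to, for a single feature):

        L(w, b) = \frac{1}{2n} \sum_{i=1}^{n} \left( y_i - (w x_i + b) \right)^2

        \frac{\partial L}{\partial w} = -\frac{1}{n} \sum_{i=1}^{n} x_i \left( y_i - (w x_i + b) \right)

    With a plain 1/n mean squared error instead, the derivative keeps its factor of 2. Both conventions find the same minimum; the choice only rescales the effective learning rate.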

  • @11harinair
    @11harinair 1 year ago

    Thanks for the video, subscribed! A suggestion: this small change to your code would demonstrate a real-world gradient descent solution for linear regression with noisy data. E.g.:
    import numpy as np
    x = np.random.randn(20, 1)
    noise = np.random.randn(20, 1) / 10
    # true parameters: w = 5.8, b = -231.9
    y = 5.8 * x - 231.9 + noise

  • @SomebodythatIusetoknow123
    @SomebodythatIusetoknow123 2 months ago

    Thee learning raaate haha cool vid !

  • @birgenc5961
    @birgenc5961 1 year ago

    Love it!

  • @majdabualnour
    @majdabualnour 1 year ago +2

    I really love your video, the idea of the video is insane and I really like it

  • @grahamfernando8775
    @grahamfernando8775 1 year ago +1

    Can you please do a TensorFlow instance segmentation video using Mask R-CNN? There isn't much YouTube content related to this online.

  • @abdulbary3668
    @abdulbary3668 1 year ago +1

    You should create a model to reduce the pressure during the last minutes, one that finds an optimal time tolerance (±): (15 ± b) 😂😂😂😂. 😢 But we need more videos like this to get a good dataset 😂😂🎉. Thanks man

  • @kartik_exe_
    @kartik_exe_ 9 months ago

    How amazing that he set the timer for 15 mins and the vid is 22 mins long

  • @alexisjulianrojashuamani1582
    @alexisjulianrojashuamani1582 1 year ago

    U R GOD MAN, so much thanks

  • @patrickm.39
    @patrickm.39 1 year ago +1

    Are you reading my mind or something? Every time I'm stuck on a topic, you drop a video about it...

    • @NicholasRenotte
      @NicholasRenotte  1 year ago

      Ayyyy, so glad you like it @Patrick. For the last two weeks I've just been making videos on stuff I find hard or want to get my head around. I figure it's not just me staring at some of these concepts like, huh?!? Thanks for checking it out!!

  • @MSCAIMLRBRITHANYA
    @MSCAIMLRBRITHANYA 1 year ago

    Oh god! You forgot to save and I involuntarily kept shouting SAVE IT! SAVE IT!

  • @terrencejeffersoncimafranc100
    @terrencejeffersoncimafranc100 1 year ago

    Can you explain the NOTEARS algorithm? It would be a great help.

  • @jakekisiel7399
    @jakekisiel7399 1 year ago

    Are there any other machine learning/NVIDIA Jetson video tutorials you would recommend?

  • @ShiftKoncepts
    @ShiftKoncepts 2 months ago

    Does gradient descent work for polynomial, multi-variable problems?

  • @aiforyounow
    @aiforyounow 1 year ago +1

    Nick, I thought there are existing algorithms that you can just feed your data into? I love the way you're doing it, but is it better to do it your style or to use existing ones??

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +2

      100% use the prebuilt ones in sklearn, this is more to understand how they work and to provide intuition for tuning and preprocessing!! Good question 👍

    • @aiforyounow
      @aiforyounow 1 year ago +1

      @@NicholasRenotte that's why I call you the Khalid of deep learning
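
    To make Nick's point about prebuilt implementations concrete, here is a minimal sketch using sklearn's SGDRegressor (my choice of estimator; it runs stochastic gradient descent under the hood):

      import numpy as np
      from sklearn.linear_model import SGDRegressor

      x = np.random.randn(100, 1)
      y = 2 * x.ravel() + 1 + np.random.randn(100) / 10

      # One fit call replaces the hand-rolled descent loop
      model = SGDRegressor(learning_rate="constant", eta0=0.01, max_iter=1000)
      model.fit(x, y)
      print(model.coef_, model.intercept_)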

  • @meguellatiyounes8659
    @meguellatiyounes8659 1 year ago +1

    I wonder how long the backpropagation algorithm would take?

  • @msa7202
    @msa7202 1 year ago

    Please do a video building a NN from scratch!!

  • @rrrfamilyrashriderockers6891
    @rrrfamilyrashriderockers6891 1 year ago

    So can you please do this algorithm for multiple variables?
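
    The same update generalizes directly to multiple variables, since the model stays linear in its weights. A minimal sketch (my own example data, not from the video); polynomial terms like x**2 can simply be added as extra feature columns, which also covers the earlier polynomial question:

      import numpy as np

      # Three input features; polynomial terms would just be extra columns of X
      X = np.random.randn(200, 3)
      true_w = np.array([1.5, -2.0, 0.7])
      y = X @ true_w + 0.3 + np.random.randn(200) / 10

      w = np.zeros(3)   # one weight per feature
      b = 0.0
      lr = 0.05
      for _ in range(1000):
          error = y - (X @ w + b)
          # Vectorized gradient step updates every weight at once
          w += lr * 2 * (X.T @ error) / len(X)
          b += lr * 2 * error.mean()
      print(w, b)  # should approach true_w and 0.3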

  • @vialomur__vialomur5682
    @vialomur__vialomur5682 1 year ago

    Thanks, waiting for part 5, forza!

  • @Pedrommelos
    @Pedrommelos 1 year ago

    Hey man! I have a friend from Lyon and you guys have the same surname, haha
    Any chance you have roots there?

  • @ChangKaiHua300
    @ChangKaiHua300 1 year ago +2

    Man, you actually made it, unless you count tuning hyperparameters as part of the challenge lol

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +1

      You're my new best friend @Kai-Hua, I could've just written it off and said "So that's a regression model with gradient descent...and nooooowww, we'll tune it!"

  • @user-tx3mo1ez2n
    @user-tx3mo1ez2n 1 year ago

    Why is it necessary for x and y to be lists of lists?

  • @adipurnomo5683
    @adipurnomo5683 1 year ago

    Bro, how would you implement gradient descent for the weights in k-nearest neighbors?

  • @quadropheniaguy9811
    @quadropheniaguy9811 1 year ago +2

    Could you please upload the correct code to GitHub? I lost track of your logic after "def descend()" etc.

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +1

      Correct code is on there @Quadrophenia, not working?

  • @MrElectrecity
    @MrElectrecity 1 year ago +1

    Please check out Auto Save in the File drop-down list, it's a real time saver 😃
    I need to watch the video many times to understand what you are doing
    But great work
    I love everything you do
    Thumbs up 👍👍

  • @rverm1000
    @rverm1000 1 year ago

    Where is it used? Why?

  • @user-uf3qh3fr7d
    @user-uf3qh3fr7d 1 year ago

    👍👍👍

  • @carlosvasquez-xp8ei
    @carlosvasquez-xp8ei 1 year ago

    Great video. Set the time to 20 mins.

  • @lakshman587
    @lakshman587 1 year ago +1

    Gift card not valid :(
    But it was fun!
    You are amazing!!

    • @NicholasRenotte
      @NicholasRenotte  1 year ago

      Got claimed super fast this time @Lakshman!!

    • @lakshman587
      @lakshman587 1 year ago +2

      @@NicholasRenotte My bad
      I have turned on notifications for your channel!
      Waiting for the next CodeThat challenge!!!!
      Hope you win next time! 🤞🤞🤞

  • @philtoa334
    @philtoa334 1 year ago +1

    Very nice

    • @NicholasRenotte
      @NicholasRenotte  1 year ago +2

      HEYYYYY PHIL!! Long time no see, thanks a mil!!

  • @nuke4496
    @nuke4496 7 months ago

    LOLLLL

  • @nikaize
    @nikaize 2 months ago

    😂😂😂

  • @winglight2008
    @winglight2008 1 year ago

    Where's my $50 gift card? Lol

  • @schubiduba1
    @schubiduba1 1 year ago

    Was too fast for me

  • @Harshadswe123
    @Harshadswe123 1 year ago

    I can do this more efficiently

  • @asoosroo989
    @asoosroo989 1 year ago +2

    You can contact us on Telegram