Building the Gradient Descent Algorithm in 15 Minutes | Coding Challenge

  • Published: 5 Feb 2025
  • Science

Comments • 100

  • @javierjdaza
    @javierjdaza 2 years ago +63

    i need to say this: you are the gamechanger here!!
    as a data scientist with 2+ years of experience, i ALWAYS learn something new with your content! please Nich, never stop doing these things, and never lose that smile on your face, even when you're hitting bugs!!
    thanks for everything

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +4

      Thank you so much for your kind words @javierjdaza!

  • @baleygrsteysionfayf9818
    @baleygrsteysionfayf9818 8 days ago

    EXTREMELY educational.
    This should get like a million views. It's a very good starting point for beginners.

  • @lakshman587
    @lakshman587 2 years ago +25

    Set the time limit to 20 mins from next time
    Because you are explaining to us as well.
    This is really awesome!!

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +5

      Thanks a million @Lakshman!! I try to keep it pretty tight so it's a good challenge, otherwise I know I'll just talk for 22 minutes anyway 😅

  • @alyt9870
    @alyt9870 2 years ago +8

    Love the channel Nicholas, I recently graduated from an NLP Master's degree, and seeing you explain stuff in a simpler way, plus your coding challenges, is really helping me connect with the material I've learned! Keep it up and I'll keep watching!

    • @NicholasRenotte
      @NicholasRenotte  2 years ago

      Woah congrats @Ally 🎊 🎉 glad you’re enjoying the challenges, plenty more to come!!

  • @nikitaandriievskyi3448
    @nikitaandriievskyi3448 2 years ago +47

    Once you initialized lr to 0.0, I knew you were going to forget to change it lol. Love the challenges tho, keep doing them, I think it would be cool to see how you implement a neural network from scratch

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +11

      I'm still kicking myself that it was the lr that tripped me up 😅. It's literally so different coding under pressure, stuff that should just flow goes out the window. OHHH yeah, I thought about a good challenge building NNs while I was at the gym, stay tuned!

    • @thegeeksides
      @thegeeksides 1 year ago

      @@NicholasRenotte did you ever make that video? A NN from scratch for handwritten digit (MNIST) classification would be so awesome!

  • @Powercube7
    @Powercube7 2 years ago +6

    the zoom in on the unsaved icon was personal 💀
    one of the reasons why I use autosave

    • @NicholasRenotte
      @NicholasRenotte  2 years ago

      😅 I was angry at myself when editing, I had to make a point of it lol😂

  • @spencerbertsch7375
    @spencerbertsch7375 2 years ago +9

    Hey Nicholas! Love your channel and I'm really appreciating these 15 minute coding challenges - please keep it up! Also, you can disable those annoying VS Code popups you ran into at 8:35 by going to Code > Preferences > Settings, then typing "editor.hover.enabled", then unchecking the "Editor > Hover" option. Hope that's useful!

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +1

      You are a lifesaver @Spencer, will do it next time i'm on the streaming rig!

  • @spicytuna08
    @spicytuna08 1 year ago

    wow. you make the subject come alive with excitement and simplicity. you are really gifted. i'd take you over hard-to-understand but smart Ph.D. professors from the Ivy League any day.

  • @MiguelNFer
    @MiguelNFer 2 years ago +4

    Awesome video!! It's pretty cool to see such theoretical concepts coded and explained like this. Keep going Nich!!

  • @Beowulf245
    @Beowulf245 2 years ago

    I've been following your channel for a while now and I always find new cool stuff here. Keep up the good work, it's really helpful. Also, I love your positive personality, you really make complex stuff look entertaining.

  • @cavaliereoscuro1098
    @cavaliereoscuro1098 2 years ago

    the essence of Deep learning in a few lines of code... awesome

  • @williamstephenjones3863
    @williamstephenjones3863 1 year ago

    This is a very novel and cool way to teach coding. I really enjoyed it, and it was good to see you troubleshoot and get stuff wrong.

  • @leonardputtmann8404
    @leonardputtmann8404 2 years ago +3

    This was oddly intense. Great job Nicholas! Even though you ran out of time, this video is still a win to me. 😉

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +2

      It definitely felt intense at the time Leonard 😅, the pressure is definitely real. I don't know what it is, but coding under pressure is just a completely different beast. Thanks a million, I'll take the win and thanks for checking it out!

  • @Mohacks
    @Mohacks 1 year ago

    Wow. This youtuber has only 197k subscribers for these absolutely high-quality videos. You deserve more than 1M+; the only thing to say is keep grinding, and you'll get there.

  • @juliansteden2980
    @juliansteden2980 2 years ago +4

    Great video!
    Would be cool to come back to this and add visualization during gradient descent using matplotlib to show what is actually happening.
    For example, drawing the data points, the regression line, the individual loss between the line and the data points, and showing stats like current step, w, b, total loss! :) (see the sketch after this thread)

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +2

      OHHHH MANNN, I thought about doing that but I was debating whether I'd hit the 15 minute deadline already. Good suggestion @Julian!
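
  A rough sketch of the visualization Julian suggests, assuming the video's setup of points near y = 2x and a partially trained w and b (the numbers below are placeholders, not values from the video):

      import numpy as np
      import matplotlib.pyplot as plt

      # Toy data mirroring the video: 10 points scattered around y = 2x
      x = np.random.rand(10, 1)
      y = 2 * x + np.random.rand(10, 1)
      w, b, step = 1.4, 0.3, 250  # placeholder mid-training parameter values

      plt.scatter(x, y, label="data")                      # the data points
      line_x = np.linspace(0, 1, 100)
      plt.plot(line_x, w * line_x + b, "r", label="fit")   # current regression line
      for xi, yi in zip(x, y):                             # per-point loss segments
          plt.plot([xi, xi], [yi, w * xi + b], "g--", linewidth=0.8)
      loss = float(np.mean((y - (w * x + b)) ** 2))        # total (mean squared) loss
      plt.title(f"step={step}  w={w:.2f}  b={b:.2f}  loss={loss:.3f}")
      plt.legend()
      plt.show()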

  • @sergioquijano7721
    @sergioquijano7721 2 years ago +2

    You are so good at explaining these complicated concepts. Also, if you want to close the Explorer tab in VS Code, try: Ctrl + B

    • @NicholasRenotte
      @NicholasRenotte  2 years ago

      Legend, thanks a million @Sergio!!

    • @sergioquijano7721
      @sergioquijano7721 2 years ago +1

      @@NicholasRenotte :D I can give you more shortcuts if you tell me where I can learn more about machine learning concepts explained the way you do it

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +1

      @@sergioquijano7721 DONE, fair trade!! Been studying this book in a ton of depth this week: themlbook.com/. I threw my own spin on the grad descent example but the fundamentals are in there!

  • @brunospfc8511
    @brunospfc8511 2 years ago +9

    i'll give you half a win, since it was a small detail

  • @einsteinsboi
    @einsteinsboi 2 years ago +2

    Amazing! I'm learning so much watching you code. Thank you for sharing.

  • @darshitgoyani2094
    @darshitgoyani2094 1 year ago

    Lots of Thanks, Nick :)

  • @ibrahim47x
    @ibrahim47x 2 years ago +4

    ChatGPT won this challenge instantly, lol:
    import numpy as np
    # Set the learning rate
    learning_rate = 0.01
    # Set the number of iterations
    num_iterations = 1000
    # Define the data points
    X = np.array([[0, 1], [1, 0], [1, 1], [0, 0]])
    y = np.array([1, 1, 0, 0])
    # Initialize the weights
    weights = np.zeros(X.shape[1])
    # Train the model
    for i in range(num_iterations):
        # Compute the predicted values (sigmoid activation)
        y_pred = 1 / (1 + np.exp(-1 * np.dot(X, weights)))
        # Compute the error
        error = y - y_pred
        # Update the weights
        weights += learning_rate * np.dot(X.T, error)
    # Print the final weights (after the loop)
    print("Weights:", weights)
    A.I. description of the code: "This script defines a simple dataset with four data points and trains a model using the gradient descent algorithm to learn the weights that minimize the error between the predicted values and the true values. The model uses a sigmoid activation function to make predictions.
    The script initializes the weights to zeros, and then iteratively updates the weights using the gradient descent algorithm, computing the predicted values, the error, and the gradient of the error with respect to the weights. The learning rate determines the size of the step taken in each iteration.
    After training the model, the final weights are printed out. You can use these weights to make predictions on new data points by computing the dot product of the data points and the weights, and applying the sigmoid function."

  • @VictorGiustiniPerez_
    @VictorGiustiniPerez_ 1 year ago

    Really nice video! Love the energy and the enthusiasm. Thanks for the help!

  • @dipendrathakuri6429
    @dipendrathakuri6429 1 year ago

    I think you missed dividing the derivative by 2. Because in the formula for the cost function we have (1/(2 * no. of training examples)) * sum of squared errors, so when we take the derivative, the 2 from dldw and the 1/2 from the cost function cancel each other. Anyway, it was a cool video, keep up the good work brother
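
  For reference, the cancellation described above, with m training examples:

      L(w, b) = \frac{1}{2m} \sum_{i=1}^{m} (y_i - (w x_i + b))^2

      \frac{\partial L}{\partial w}
          = \frac{1}{2m} \sum_{i=1}^{m} 2 (y_i - (w x_i + b)) (-x_i)
          = -\frac{1}{m} \sum_{i=1}^{m} x_i (y_i - (w x_i + b))

  The 2 from differentiating and the 1/2 in the cost cancel. Dropping the 1/2 (as in the video) just doubles the gradient, which an adjusted learning rate absorbs, so the fitted line comes out the same.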

  • @patrickm.39
    @patrickm.39 2 years ago +1

    Are you reading my mind or something? Every time I'm stuck on a topic, you drop a video about it...

    • @NicholasRenotte
      @NicholasRenotte  2 years ago

      Ayyyy, so glad you like it @Patrick. For the last two weeks I've just been making videos on stuff I find hard or want to get my head around. I figure it's not just me staring at some of these concepts like huh?!? Thanks for checking it out!!

  • @sana7388
    @sana7388 4 months ago

    Could you please provide the whole code, maybe in the description or elsewhere? Thank you! Your videos are a life saver.
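
  Not the official code, but a minimal runnable sketch of what the video builds. The names descend, w, b, lr and the rough data setup are taken from what other commenters quote; everything in between is an assumption:

      import numpy as np

      # Toy data: 10 points around y = 2x, shaped (10, 1)
      x = np.random.rand(10, 1)
      y = 2 * x + np.random.rand(10, 1)

      # Parameters start at zero; lr must be non-zero or nothing ever updates
      w, b, lr = 0.0, 0.0, 0.01

      def descend(x, y, w, b, learning_rate):
          # Accumulate the gradient of the squared error over every point
          dldw, dldb = 0.0, 0.0
          N = x.shape[0]
          for xi, yi in zip(x, y):
              dldw += -2 * xi * (yi - (w * xi + b))
              dldb += -2 * (yi - (w * xi + b))
          # Step against the averaged gradient
          w = w - learning_rate * (1 / N) * dldw
          b = b - learning_rate * (1 / N) * dldb
          return w, b

      # Repeat the descent step and watch the loss fall
      for epoch in range(400):
          w, b = descend(x, y, w, b, lr)
          loss = np.mean((y - (w * x + b)) ** 2)
      print("w:", w, "b:", b, "loss:", loss)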

  • @MSCAIMLRBRITHANYA
    @MSCAIMLRBRITHANYA 2 years ago

    oh god! you forgot to save and i involuntarily kept shouting SAVE IT! SAVE IT!

  • @akumlonglongkumer3824
    @akumlonglongkumer3824 1 year ago

    Pretty impressive. This is awesome. Cheers

  • @lvjianlvj4604
    @lvjianlvj4604 1 year ago

    I really like this video. It is great!

  • @kashishrajput4934
    @kashishrajput4934 2 years ago +3

    That's so informative, thank you so much

  • @_danfiz
    @_danfiz 2 years ago +1

    This is cool, seeing it in real time.

  • @luis96xd
    @luis96xd 2 years ago

    Great video, I like this kind of video where you code some AI task against the clock, teach us the concepts, and show us the reality of implementing it👏
    Well explained 😄👍

  • @kartik_exe_
    @kartik_exe_ 1 year ago

    how amazing it is that he set the timer for 15 mins and the vid is 22 mins long

  • @abdulbary3668
    @abdulbary3668 1 year ago +1

    You should create a model to reduce the pressure during the last minutes, such as finding an optimal time tolerance (15±b) 😂😂😂😂. 😢 but we need more videos like this to get a good dataset 😂😂🎉. Thanks man

  • @majdabualnour
    @majdabualnour 2 years ago +2

    I really love your video, the idea of the video is insane and I really like it

  • @alfathterry7215
    @alfathterry7215 10 months ago

    this is gold!

  • @rokunuzjahanrudro7571
    @rokunuzjahanrudro7571 1 year ago

    Great video 🎉🎉

  • @SomebodythatIusetoknow123
    @SomebodythatIusetoknow123 10 months ago

    Thee learning raaate haha, cool vid!

  • @11harinair
    @11harinair 2 years ago

    Thanks for the video, subscribed! A suggestion: this small change to your code would demonstrate a real-world gradient descent solution for linear regression with noisy data. E.g.:
    x = np.random.randn(20,1)
    noise = np.random.randn(20,1)/10
    # true parameters: w = 5.8, b = -231.9
    y = 5.8*x - 231.9 + noise

  • @ChangKaiHua300
    @ChangKaiHua300 2 years ago +2

    Man you actually made it, unless you say tuning the hyperparameters is part of the challenge lol

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +1

      You're my new best friend @Kai-Hua, I could've just written it off and said "So that's a regression model with gradient descent...and nooooowww, we'll tune it!"

  • @adipurnomo5683
    @adipurnomo5683 2 years ago

    Nice implementation bro

  • @Felicia-126
    @Felicia-126 2 years ago

    Amazing video!! Thank you so much

  • @grahamfernando8775
    @grahamfernando8775 2 years ago +1

    Can you please do a TensorFlow instance segmentation video using Mask R-CNN? There isn't much YouTube content related to this online.

  • @alexisjulianrojashuamani1582
    @alexisjulianrojashuamani1582 1 year ago

    U R GOD MAN, so much thanks

  • @aiforyounow
    @aiforyounow 2 years ago +1

    Nick, I thought there were existing algorithms that u can just feed your data into? I love the way you're doing it, but is it better to build it your way or use the existing ones??

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +2

      100% use the prebuilt ones in sklearn, this is more to understand how they work and to provide intuition for tuning and preprocessing (see the sketch after this thread)!! Good question 👍

    • @aiforyounow
      @aiforyounow 2 years ago +1

      @@NicholasRenotte that’s why I call u Khalid of deep learning
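
  For anyone wanting the prebuilt route from the reply above, a minimal sketch using sklearn's SGDRegressor on made-up data (illustrative only, not code from the video):

      import numpy as np
      from sklearn.linear_model import SGDRegressor

      # Same kind of toy problem: points scattered around y = 2x
      x = np.random.rand(100, 1)
      y = (2 * x + np.random.rand(100, 1)).ravel()  # sklearn expects a 1-D target

      # SGDRegressor runs gradient descent on the squared error under the hood
      model = SGDRegressor(loss="squared_error", learning_rate="constant",
                           eta0=0.01, max_iter=1000)
      model.fit(x, y)
      print("w:", model.coef_, "b:", model.intercept_)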

  • @birgenc5961
    @birgenc5961 2 years ago

    Love it!

  • @jakekisiel7399
    @jakekisiel7399 2 years ago

    Are there any other machine learning/NVIDIA Jetson video tutorials you would recommend?

  • @tomoki-v6o
    @tomoki-v6o 2 years ago +1

    I wonder how long the backpropagation algorithm would take?

  • @miraculousladynoir1023
    @miraculousladynoir1023 5 months ago

    man, i am new to this. Why are the updates not zero when the learning rate is? What does a learning rate of 0 mean? If it does not learn, what is the purpose of building it?
    Edit: Nvm. Saw the rest of the video, lol.
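
  For anyone with the same question: the update is the gradient scaled by the learning rate, so a learning rate of 0 means the parameters never move, whatever the gradient says. A tiny illustration with made-up numbers (not the video's code):

      lr = 0.0
      w, dldw = 0.5, 3.2    # some parameter and its gradient (made-up values)
      w = w - lr * dldw     # the step is lr * gradient = 0, so w is unchanged
      print(w)              # 0.5 -> with lr = 0 the model never "learns"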

  • @विशालकुमार-छ7त

    why is it necessary for x and y to be lists of lists?

  • @terrencejeffersoncimafranc100
    @terrencejeffersoncimafranc100 2 years ago

    Can you explain the NOTEARS algorithm? It would be a great help.

  • @msa7202
    @msa7202 2 years ago

    Please do a video building a NN from scratch!!

  • @Pedrommelos
    @Pedrommelos 2 years ago

    hey man! I have a friend from Lyon and you guys have the same surname, haha
    Any chance you have roots from there?

  • @vialomur__vialomur5682
    @vialomur__vialomur5682 2 years ago

    Thanks, waiting for part 5 of the Forza series!

  • @rrrfamilyrashriderockers6891
    @rrrfamilyrashriderockers6891 2 years ago

    So can you please do this algorithm for multiple variables?

  • @adipurnomo5683
    @adipurnomo5683 2 years ago

    Bro, how do you implement gradient descent as weights in K-nearest neighbors?

  • @MrElectrecity
    @MrElectrecity 2 years ago +1

    Please check the Auto Save option in the File drop-down list, it's a real time saver 😃
    I need to watch the video many times to understand what you are doing
    But great work
    I love all that you do
    Thumbs up 👍👍

  • @quadropheniaguy9811
    @quadropheniaguy9811 2 years ago +2

    Could you please upload the correct code to GitHub? I lost track of your logic after "def descend()" etc.

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +1

      The correct code is on there @Quadrophenia, is it not working?

  • @ShiftKoncepts
    @ShiftKoncepts 10 months ago

    Does gradient descent work for polynomial, multi-variable problems?

  • @rverm1000
    @rverm1000 1 year ago

    where is it used? why?

  • @ДимаДмитрий-е1к
    @ДимаДмитрий-е1к 2 years ago

    👍👍👍

  • @carlosvasquez-xp8ei
    @carlosvasquez-xp8ei 2 years ago

    Great video. Set time to 20 mins.

  • @lakshman587
    @lakshman587 2 years ago +1

    Gift card not valid :(
    But it was fun!
    You are amazing!!

    • @NicholasRenotte
      @NicholasRenotte  2 years ago

      Got claimed super fast this time @Lakshman!!

    • @lakshman587
      @lakshman587 2 years ago +2

      @@NicholasRenotte My bad
      I have turned on notifications for your channel!
      Waiting for the next coding challenge!!!!
      Hope you win next time! 🤞🤞🤞

  • @nikaize
    @nikaize 10 months ago

    😂😂😂

  • @schubiduba1
    @schubiduba1 1 year ago

    Was too fast for me

  • @philtoa334
    @philtoa334 2 years ago +1

    Very nice

    • @NicholasRenotte
      @NicholasRenotte  2 years ago +2

      HEYYYYY PHIL!! Long time no see, thanks a mil!!

  • @Harshadswe123
    @Harshadswe123 2 years ago

    I can do this more efficiently

  • @winglight2008
    @winglight2008 2 years ago

    Where's my $50 gift card? Lol

  • @nuke4496
    @nuke4496 1 year ago

    LOLLLL

  • @efeshenem18
    @efeshenem18 4 days ago

    hi, i have a question: why didn't you use y = 2*x + np.random.rand(10,1)? in your version we have 10 points that lie on a line with a fixed slope; with y = 2*x + np.random.rand(10,1) every point will have a different slope and intercept.

  • @aso_o.111
    @aso_o.111 2 years ago +2

    You can contact us on telegram