The StatQuest Introduction to PyTorch

  • Published: 1 Jun 2024
  • PyTorch is one of the most popular tools for making Neural Networks. This StatQuest walks you through a simple example of how to use PyTorch one step at a time. By the end of this StatQuest, you'll know how to create a new neural network from scratch, make predictions and graph the output, and optimize a parameter using backpropagation (a rough code sketch of this kind of network follows this description). BAM!!!
    To learn more about Lightning: lightning.ai/
    The code demonstrated in this video can be downloaded here:
    lightning.ai/lightning-ai/stu...
    This StatQuest assumes that you are already familiar with...
    Neural Networks: • The Essential Main Ide...
    Backpropagation: • Neural Networks Pt. 2:...
    The ReLU Activation Function: • Neural Networks Pt. 3:...
    Tensors: • Tensors for Neural Net...
    To install PyTorch see: pytorch.org/get-started/locally/
    To install matplotlib, see: matplotlib.org/stable/users/g...
    To install seaborn, see: seaborn.pydata.org/installing...
    For a complete index of all the StatQuest videos, check out...
    app.learney.me/maps/StatQuest
    ...or...
    statquest.org/video-index/
    If you'd like to support StatQuest, please consider...
    Buying The StatQuest Illustrated Guide to Machine Learning!!!
    PDF - statquest.gumroad.com/l/wvtmc
    Paperback - www.amazon.com/dp/B09ZCKR4H6
    Kindle eBook - www.amazon.com/dp/B09ZG79HXC
    Patreon: / statquest
    ...or...
    YouTube Membership: / @statquest
    ...a cool StatQuest t-shirt or sweatshirt:
    shop.spreadshirt.com/statques...
    ...buying one or two of my songs (or go large and get a whole album!)
    joshuastarmer.bandcamp.com/
    ...or just donating to StatQuest!
    www.paypal.me/statquest
    Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
    / joshuastarmer
    0:00 Awesome song and introduction
    1:38 Coding preliminaries
    2:15 Creating a neural network in PyTorch
    7:54 Graphing the neural network's output
    10:47 Optimizing a parameter with backpropagation
    #StatQuest #NeuralNetworks #PyTorch
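    A rough sketch of the kind of network the video builds (the class name, parameter names, and values below are illustrative placeholders, not copied from the video's downloadable code):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BasicNN(nn.Module):  # illustrative name; see the downloadable code for the real thing
        def __init__(self):
            super().__init__()
            # fixed weights and biases (requires_grad=False); values are placeholders
            self.w00 = nn.Parameter(torch.tensor(1.7), requires_grad=False)
            self.b00 = nn.Parameter(torch.tensor(-0.85), requires_grad=False)
            self.w01 = nn.Parameter(torch.tensor(-40.8), requires_grad=False)
            self.w10 = nn.Parameter(torch.tensor(12.6), requires_grad=False)
            self.b10 = nn.Parameter(torch.tensor(0.0), requires_grad=False)
            self.w11 = nn.Parameter(torch.tensor(2.7), requires_grad=False)
            # the one parameter optimized with backpropagation in the example
            self.final_bias = nn.Parameter(torch.tensor(0.0), requires_grad=True)

        def forward(self, input_value):
            top = F.relu(input_value * self.w00 + self.b00) * self.w01
            bottom = F.relu(input_value * self.w10 + self.b10) * self.w11
            return F.relu(top + bottom + self.final_bias)

    model = BasicNN()
    output_values = model(torch.linspace(start=0, end=1, steps=11))  # predictions to graph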

Comments • 340

  • @statquest
    @statquest  2 years ago +25

    The code demonstrated in this video can be downloaded here: lightning.ai/lightning-ai/studios/statquest-introduction-to-coding-neural-networks-with-pytorch?view=public&section=all
    To learn more about Lightning: lightning.ai/
    This StatQuest assumes that you are already familiar with...
    Neural Networks: ruclips.net/video/CqOfi41LfDw/видео.html
    Backpropagation: ruclips.net/video/IN2XmBhILt4/видео.html
    The ReLU Activation Function: ruclips.net/video/68BZ5f7P94E/видео.html
    Tensors: ruclips.net/video/L35fFDpwIM4/видео.html
    To install PyTorch see: pytorch.org/get-started/locally/
    To install matplotlib, see: matplotlib.org/stable/users/getting_started/
    To install seaborn, see: seaborn.pydata.org/installing.html
    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

    • @yongjiewang9686
      @yongjiewang9686 2 years ago +5

      REALLY Hope you can continue with this PyTorch tutorial.

    • @statquest
      @statquest  2 years ago +1

      @@yongjiewang9686 Will do!

    • @shichengguo8064
      @shichengguo8064 1 year ago

      Do we have video talking about transformer? Thanks.

    • @statquest
      @statquest  1 year ago

      @@shichengguo8064 Not yet, but soon.

    • @Mayur7Garg
      @Mayur7Garg 1 year ago

      Just a small comment. Variables should not share names with Python builtins. The 'input' parameter in forward should have been called something else, since input() is already a builtin function in Python; otherwise you end up shadowing the builtin within that scope.
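      For example, a minimal sketch of the rename being suggested (forward_shadowing, forward_clear, and input_value are illustrative names, not from the video):

      # A parameter named 'input' hides the builtin input() only inside that function,
      # but a descriptive, non-builtin name avoids the confusion entirely.
      def forward_shadowing(input):        # 'input' shadows the builtin here
          return input * 2

      def forward_clear(input_value):      # descriptive, non-builtin name
          return input_value * 2

      print(forward_clear(21))             # 42; the builtin input() is untouched elsewhere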

  • @santoshmohanram536
    @santoshmohanram536 2 years ago +107

    Favorite teacher with my favorite Deep learning framework. Lucky to have you. Thanks brother🙏

  • @firesongs
    @firesongs 2 years ago +48

    Please continue to go through every single line of code including the parameters with excruciating detail like you do.
    None of my professors went over each line like that cuz they always "assumed we already knew" and everyone in the class who didn't already know was afraid to ask to avoid looking stupid. Thank you.

  • @gummybear8883
    @gummybear8883 2 years ago +9

    What a blessing this is. You are indeed the Richard Feynman of Data Science.

  • @insushin6139
    @insushin6139 3 months ago +4

    StatQuest is the GOAT in statistics, machine learning, and deep learning! Your videos are really helping me understand the concepts and outlines of these fields! Love from Korea!

  • @youlahr7589
    @youlahr7589 2 years ago +27

    I've used PyTorch for projects before, but I can honestly say that I never fully understood the workings of building a model. I knew that I needed the pieces you mentioned, but not why I needed them. You've just explained it incredibly. Please don't stop making this series!!

    • @statquest
      @statquest  2 years ago +4

      Thank you very much! :)

  • @footballistaedit25
    @footballistaedit25 2 years ago +29

    Thanks for the best content you bring. I hope you continue to make a full pytorch playlist

  • @karlnikolasalcala8208
    @karlnikolasalcala8208 7 months ago +2

    YOU ARE THE BEST TEACHER EVER JOSHH!! I wish you can feel the raw feeling we feel when we watch your videos

  • @jamilahmed2926
    @jamilahmed2926 8 months ago +2

    I have lived long enough to watch videos and understand nothing about ML stuff, until I saw your videos. I truly wish you well.

  • @jonnhw
    @jonnhw 2 years ago +3

    Was looking for a pytorch resource and was disappointed when this channel didn't have one yet, but then this got uploaded. Really a blessing to the people haha

  • @frederikschnebel2977
    @frederikschnebel2977 2 years ago +4

    Thanks so much for this gem John! Literally got a PyTorch project coming up and your timing is just perfect. Greatly appreciate the content, keep up the good work :)

  • @sapnasharma4476
    @sapnasharma4476 2 years ago +2

    Thanks for the awesome tutorial! You make the most difficult things so easy to understand, especially with the visuals and the arrows and all! The comments written on the right-hand side make it so much more helpful to pause and absorb. I would never miss a video of your tutorials!

    • @statquest
      @statquest  2 years ago

      Hooray! I'm glad you like my videos. :)

  • @binhle9475
    @binhle9475 1 year ago +5

    AMAZING video. This is exactly what beginners need to start the PyTorch journey with a semi-solid footing instead of mindless copying.
    You must have spent so much time on your AWESOME videos.
    GREATLY appreciate your effort. Keep up the good work.

    • @statquest
      @statquest  1 year ago +2

      Thank you very much! :)

  • @AlbertsJohann
    @AlbertsJohann 2 years ago +1

    What a great feeling when it all clicks after learning about all these concepts in isolation. All thanks to an incredibly brilliant teacher! Triple BAM!!!

    • @statquest
      @statquest  2 years ago

      Hooray!!! Thank you!

  • @anashaat95
    @anashaat95 1 year ago +1

    This series about neural networks and deep learning is very well explained. Thank you soooooooo much.

  • @kaanzt
    @kaanzt 9 months ago +1

    That's a really cool explanation! Please continue this PyTorch series, we really need it. BAM!

  • @jessicas2978
    @jessicas2978 2 years ago +2

    Thank you so much, Josh. I have been learning PyTorch and deep learning. This video helps me a lot!

  • @the_real_cookiez
    @the_real_cookiez 2 years ago +2

    Quality educational content! It's so cool to see your channel grow. Been here since ~90k subs! Very well earned.

    • @statquest
      @statquest  2 years ago +2

      Wow! Thank you very much!!! BAM! :)

  • @veronikaberezhnaia248
    @veronikaberezhnaia248 2 years ago +3

    Amazing content, as always. Before, I was a bit afraid to start coding in torch, so thank you for encouraging me to do that!

    • @statquest
      @statquest  2 years ago

      bam! You can do it! :)

  • @amirhosseinafkhami2606
    @amirhosseinafkhami2606 2 years ago +4

    Great explanation as always! Thanks for making content like this, which complements the theoretical concepts.

    • @statquest
      @statquest  2 years ago +1

      Glad you liked it!

  • @Ajeet-Yadav-IIITD
    @Ajeet-Yadav-IIITD 2 years ago +2

    Thank you Josh, pls continue this series of pytorch!

  • @viveksundaram4420
    @viveksundaram4420 2 years ago +3

    Man, you are love. I started my neural net journey from your videos and it's the best decision I made. Thank you

  • @xedvap
    @xedvap 1 year ago +1

    Looking forward to seeing your following videos! Excellent explanation!

  • @mahammadodj
    @mahammadodj 1 year ago +1

    Thank you very much! I am new to Deep Learning. I can say that just in one week I learned a lot of things from your tutorials!

  • @kleanthivoutsadaki5989
    @kleanthivoutsadaki5989 1 year ago +1

    thanks Josh, you really make understanding Neural Networks concepts a great process!

  • @ais3153
    @ais3153 2 years ago +2

    BIG LIKE before watching 👍🏻 please continue the pytorch series

  • @nicolasreinaldet732
    @nicolasreinaldet732 2 years ago +2

    Guess who was going to start programming a neural network in Python today......
    God bless you Josh, because He knows how much you are blessing me with your work.
    And know that Jesus loves you and wants to be part of your life.

  • @StratosFair
    @StratosFair 2 years ago +1

    Nice video, looking forward to the next ones on PyTorch Lightning!

  • @exxzxxe
    @exxzxxe 1 year ago +1

    Another charming, fully informative masterpiece.

    • @statquest
      @statquest  1 year ago

      Thank you very much! BAM! :)

  • @kwang-jebaeg2460
    @kwang-jebaeg2460 2 years ago +2

    Wonderful!!! Can't wait for your PyTorch Lightning code for NNs. As always, thanks a lot!!

  • @yashsurange7648
    @yashsurange7648 1 year ago +1

    Thanks for this amazing walk through.

  • @_epe2590
    @_epe2590 2 years ago +3

    finally! some simple-to-understand content on how to make an AI model using PyTorch!!! TRIPLE BAM!!!!

  • @MariaHendrikx
    @MariaHendrikx 6 months ago +1

    I love how you visualize and synchronize the code with the maths behind it :) On top of that you are doing it step-wise which results in a really awesome and very eduSupercalifragilisticexpialidociouscational video! #ThankYou

    • @statquest
      @statquest  6 months ago

      I love it. Thank you very much! :)

  • @justinhuang8034
    @justinhuang8034 2 years ago +1

    Man the content keeps getting better

  • @someone5781
    @someone5781 2 years ago +1

    Woo! Been waiting for this sort of a tutorial!!!

  • @AHMAD9087
    @AHMAD9087 2 years ago +4

    The tutorial we all needed 🙂

  • @theblueplanet3576
    @theblueplanet3576 3 months ago +1

    Enjoying this series on machine learning. By the way there is no shame in self promotion, you deserve it 😁

  • @Sandeepkumar-dm2bp
    @Sandeepkumar-dm2bp 1 year ago +1

    very well explained, thank you for providing quality content, it's very helpful

  • @aabshaarahmad7853
    @aabshaarahmad7853 1 year ago +1

    Hi! This is amazing. Are you gonna continue this series? Out of ten different rabbitholes I have been to, this video has been the most helpful for me with understanding PyTorch and starting off with my project. Please continue making more complicated models. Thank you :)

  • @stivraptor
    @stivraptor 1 year ago +1

    Hey Josh!
    Guess what just arrived in the mail....
    My new statquest mug!!!!!
    Hooray!!!

    • @statquest
      @statquest  1 year ago

      BAM!!! Thank you so much for supporting StatQuest!!!

  • @aayushjariwala6256
    @aayushjariwala6256 2 years ago +4

    It amazes me that there is no NLP video on StatQuest! Josh, your explanations are always better than anyone could expect, and you have created so many series covering both the math and the conceptual understanding. NLP is just as important as computer vision, and people actually struggle to learn it because of the lack of good content! I hope you will create a series, or maybe a few videos on the basic concepts, to help people get interested in NLP : ) Hope you are doing well in life, Josh

    • @statquest
      @statquest  2 years ago +5

      I'm working on NLP.

    • @vans4lyf2013
      @vans4lyf2013 2 years ago +3

      @@statquest Yay so glad to hear this, we really need you because no one gives great explanations like you do. Also your youtube comments are the nicest I've ever seen which is a testament to how valued you are in this community.

    • @statquest
      @statquest  2 years ago

      @@vans4lyf2013 Thank you very much!

  • @carlitosvh91
    @carlitosvh91 27 days ago +1

    Great explanation. Thank you very much

  • @praptithapaliya6570
    @praptithapaliya6570 1 year ago +1

    I love you Josh. God bless you. You're my favorite teacher.

  • @Luxcium
    @Luxcium 2 months ago +1

    I am someone who loves *SQ,* and *JS* style of teaching in byte 😅 pieces but I also hate _snakes…_ I love *JavaScript* and *TypeScript* but I’ve been learning *JavaScript* with the _strictest linting rules_ one would imagine… and given how *JavaScript* could be used without any sort of strict rules (and is very similar to *Python* in this context) it is frustrating that it makes *Python* very hard to understand despite being easier since it has not the same stricter rules I have imposed myself learning *JavaScript…* but I am also genuinely grateful that *JS* is the best instructor for this kind of topics because *JS* has a _Ukulele,_ *StatSquatch* and *Normalsaurus* which are all there to help *JS* make *SQ* awesome 🎉🎉🎉🎉 Thanks 😅😅😅❤

  • @darshuetube
    @darshuetube 1 year ago +1

    it's great that you are making videos on coding as well.

  • @mohsenmoghimbegloo
    @mohsenmoghimbegloo 2 years ago +1

    Thank you very much Mr Josh Starmer

  • @Luxcium
    @Luxcium 11 months ago

    Wow 😮 I didn't know I had to watch the *Neural Networks part 2* before I can watch the *The StatQuest Introduction To PyTorch* before I can watch the *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud, I understand)
    I am genuinely so happy to learn about that stuff with you Josh❤ I will go watch the other videos first and then I will back propagate to this video...

  • @arer90
    @arer90 2 years ago +3

    Thank you for perfect lecture~!!!

  • @onkarpandhare
    @onkarpandhare 2 years ago +1

    great video! very well explained!!!👍👍

  • @anggipermanaharianja6122
    @anggipermanaharianja6122 2 years ago +3

    Awesome vid by the legend!

  • @isaacpeng3625
    @isaacpeng3625 2 years ago +1

    great video and explanation! I have been struggling with pytorch coding

  • @jwilliams8210
    @jwilliams8210 1 year ago +2

    Absolutely brilliant!

  • @bjornnorenjobb
    @bjornnorenjobb 2 years ago +1

    omg! I have really wanted this! awesome!!! :) :) :)

  • @ISK_VAGR
    @ISK_VAGR 2 years ago

    That is a big leap. I need to check it several times to understand it since I am not a programmer. However, I really got a good feeling of what is happening inside the code. I actually use codeless systems such as KNIME. So if Mr. Sasquatch gets the idea of using KNIME to explain all this, it will be amazing. Thanks for being such a good teacher.

    • @statquest
      @statquest  2 years ago

      I'll keep that in mind.

  • @jawadmansoor6064
    @jawadmansoor6064 1 year ago +1

    Great series.

  • @sceaserjulius9476
    @sceaserjulius9476 2 years ago +1

    I am also learning Deep Learning, and want to apply it to make good projects,
    This is going to be great.

  • @Irrazzo
    @Irrazzo 1 year ago

    Thank you, good explanation!
    16:00 Python prefers for-each-loops over index-based loops. See how this equivalent for-each loop looks much simpler.
    for input, label in zip(inputs, labels):
        output = model(input)
        loss = (output - label)**2
        loss.backward()
        total_loss += float(loss)

  • @BrianBin
    @BrianBin 2 years ago +1

    Your teaching video is awesome

    • @statquest
      @statquest  2 years ago

      Thank you!

    • @BrianBin
      @BrianBin 2 years ago

      @@statquest Do you have an intro to Lightning? I seem to remember you mentioning in the video that you have one?

    • @statquest
      @statquest  2 years ago +2

      @@BrianBin That's going to be the next video in this series. It will come out in a few weeks.

  • @mohammadalikhani7798
    @mohammadalikhani7798 8 months ago +1

    That was nice! Thank you!

    • @statquest
      @statquest  8 months ago

      Glad you liked it!

  • @Hitesh10able
    @Hitesh10able 8 months ago

    Another excellent video. One humble request: please make a video on Stable Diffusion models.

    • @statquest
      @statquest  8 months ago

      I'll keep that in mind.

  • @whispers191
    @whispers191 2 years ago +1

    Thank you Josh!

  • @lloydchan9606
    @lloydchan9606 2 years ago +2

    bless josh and this channel

  • @WeeeAffandi
    @WeeeAffandi 2 years ago +1

    Josh explaining the code is far better than any programmer

  • @emilyli6763
    @emilyli6763 2 years ago +2

    honestly wish I had this a year ago when I was struggling, still watching now tho!

  • @neginamirabadi4595
    @neginamirabadi4595 2 years ago +2

    Hello Josh! Thank you so much for your amazing videos! I have learned so much from your tutorials and would not have been able to advance without them!
    I wanted to ask whether it is possible for you to put out some videos on time series analysis, including autoregression (AR), moving average (MA) and their combinations. I would be more than grateful if you could provide such a video. Thank you so much.

    • @statquest
      @statquest  2 years ago +1

      I'll keep those topics in mind!

  • @karrde666666
    @karrde666666 2 years ago +2

    better than MIT or any university slides

  • @sanjaykrish8719
    @sanjaykrish8719 2 years ago +1

    wow.. super excited

  • @05747100
    @05747100 2 years ago +1

    Thanks a lot; begging for a PyTorch series playlist.

  • @pfever
    @pfever 1 year ago +1

    Best tutorial as usual! Would be nice to see more advanced examples in PyTorch, like a CNN for image classification :)

  • @petercourt
    @petercourt 2 years ago +1

    Amazing work Josh!

  • @arijitchaki1884
    @arijitchaki1884 4 months ago

    Hi Josh, sorry to be a spoilsport, but I used the exact same code and my prediction shows 0.5 for a dosage of 0.5; it runs for all 100 epochs and the final b value comes out to be -16.51 😔. But yes, the concept is clear!! Great work! I always tell people who are interested in learning about data science or machine learning to refer to your channel. Seeing your channel grow from 10-20K to a million is a pleasure to my eyes!! You are "El Professor"!!

    • @statquest
      @statquest  3 months ago +1

      Thank you very much! If you look at my actual code (follow the link), you'll see that I actually pulled a trick with the data to get it to train faster.

  • @peterkanini867
    @peterkanini867 11 months ago

    Please make an entire tutorial about the ins and outs of PyTorch!

    • @statquest
      @statquest  11 months ago

      I've made several PyTorch videos and will continue to make more. You can find the others here: statquest.org/video-index/

  • @AkmKawserOfficial
    @AkmKawserOfficial 1 year ago +1

    Really Awesome

  • @joshstat8114
    @joshstat8114 3 months ago +1

    Nice video introducing LSTMs using PyTorch. There is also a `torch` R package that doesn't require installing Python or torch. It's so nice that R also has a deep learning framework aside from `tensorflow`, and I recommend maybe trying it.

    • @statquest
      @statquest  3 months ago

      Thanks for the info!

    • @joshstat8114
      @joshstat8114 3 months ago +1

      @@statquest I strongly recommend it because it is so nice that R has its own deep learning frameworks, besides h2o

  • @abbddos
    @abbddos 2 years ago +1

    This was great... I hope you can simplify TensorFlow the same way... big big thank you.

  • @junechu9701
    @junechu9701 1 year ago +1

    Soooooooo thankful!

  • @animegod567
    @animegod567 2 years ago +1

    Thanks for this

  • @carol8099
    @carol8099 2 years ago +1

    this video is gold

  • @kobic8
    @kobic8 1 year ago

    great presentation!! thanks again for simplifying this topic! are you planning to post more on NN implementation? computer vision maybe, or object detection?

    • @statquest
      @statquest  1 year ago +1

      Yes, there will be many more videos on how to implement NNs.

  • @guramikeretchashvili1569
    @guramikeretchashvili1569 2 years ago

    Such interesting videos and good explanations. I am wondering which software you use to make these cool visualizations?

    • @statquest
      @statquest  2 years ago

      I share all my secrets here: ruclips.net/video/crLXJG-EAhk/видео.html

  • @user-ej1nj5ry6l
    @user-ej1nj5ry6l 2 years ago +1

    KOREAN BAMMMM!!! TY StatQuest😁

  • @vladimirbosinceanu5778
    @vladimirbosinceanu5778 2 years ago +1

    omg this is amazing

  • @aiforeveryone2941
    @aiforeveryone2941 2 years ago +1

    Double bam new way to teach coding

  • @getingbored
    @getingbored 9 months ago +1

    Thank you :)

    • @statquest
      @statquest  9 months ago

      You're welcome!

  • @wert572
    @wert572 1 year ago +1

    Hi Josh, thank you for introducing pytorch to me. I have an off topic question. How do you create your videos? They look like a series of animated slides. I want to emulate your style for creating presentation slides.

    • @statquest
      @statquest  1 year ago

      I give away all of my secrets in this video: ruclips.net/video/crLXJG-EAhk/видео.html

  • @jianzhen3
    @jianzhen3 1 year ago +1

    thank you so much for explaining it so clearly; if I didn't click the thumbs up button, I would feel guilty

  • @Luxcium
    @Luxcium 2 months ago +1

    So I paused at 18:04 because _it blew my mind_ that we were calling backward() on the loss variable because I thought it was defined on the line above… 😅😅😅😅 but yeah I didn't find anything so one hour later I was just watching the rest of the video and _to be honest_ in about 33 seconds it came out that it was normal for _my mind to be blown_ 😂😂😂😂 at 18:37

    • @statquest
      @statquest  2 months ago

      Totally! I was like, "what?!?!?" when I first saw that.
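      (For reference, a minimal sketch of why calling backward() on the loss works: the loss tensor records how it was computed, so autograd can walk back through that graph. The numbers below are toy values, not from the video.)

      import torch

      b = torch.tensor(0.0, requires_grad=True)   # one trainable parameter
      prediction = 0.5 + b                        # toy forward pass
      loss = (prediction - 1.0) ** 2              # squared residual
      print(loss.grad_fn)                         # the recorded graph node, e.g. <PowBackward0 ...>
      loss.backward()                             # fills b.grad with d(loss)/d(b)
      print(b.grad)                               # tensor(-1.)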

  • @minglee5164
    @minglee5164 2 years ago +1

    Wonderful

  • @sabaaslam781
    @sabaaslam781 1 year ago +1

    Hi Josh. I am a big fan of your videos. I have a question regarding this quest. In this video, we optimized only one parameter. How can we optimize all the parameters? Thanks in advance.

    • @statquest
      @statquest  1 year ago

      I show how to optimize all of the parameters in this video on LSTMs in PyTorch: ruclips.net/video/RHGiXPuo_pI/видео.html (if you want to learn about the theory of LSTMs, see: ruclips.net/video/YCzL96nL7j0/видео.html)

  • @HtHt-in7vt
    @HtHt-in7vt 2 years ago

    I would appreciate it if you could teach more and go deeper into PyTorch. Thank you so much!

    • @statquest
      @statquest  2 years ago +2

      That's the plan. This is just the first of many videos on how to code neural networks. The next video will be on pytorch lightning, and then we'll start to create more advanced models.

  • @andreblanco
    @andreblanco 2 years ago +3

    Triple bam!

    • @statquest
      @statquest  2 years ago

      BAM! Thank you very much for supporting StatQuest!!!!

  • @conlele350
    @conlele350 2 years ago +1

    Thanks Josh, it's an incredible video. Besides, the application of Bayes' theorem in fitting models (linear, logistic, random forest...) has recently become more and more popular as a replacement for classic statistical methods. Could you please take some time to explain some of its popular algorithms, like BART and linear regression via Bayesian methods?

    • @statquest
      @statquest  2 years ago +3

      I'm planning on doing a whole series on Bayesian stuff as soon as I finish this series on neural networks.

    • @conlele350
      @conlele350 2 years ago +2

      @@statquest that's great news for today, thanks Josh, I'm looking forward to seeing it soon

  • @random-hj1gv
    @random-hj1gv 2 months ago

    Hello! I'm a bit confused: at 15:08 you mention that at each epoch we'll be running all 3 data points through the model, but wasn't the point of SGD that we would only need a single data point per epoch, or am I misunderstanding something? Btw, despite my confusion, this is by far the best ML guide series I've seen, thank you for your work!

    • @statquest
      @statquest  2 months ago

      That's a good question. "torch.optim" doesn't have a gradient descent optimizer, just a stochastic gradient descent optimizer. So we import torch.optim.SGD and then pass it all of the residuals to get gradient descent.

    • @random-hj1gv
      @random-hj1gv 2 months ago +1

      @@statquest Makes sense, thank you for the clarification!
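      (A minimal sketch of the reply above: torch.optim.SGD just applies "parameter -= lr * gradient"; whether that is stochastic or plain gradient descent depends on how much data contributes to the gradients before step() is called. The toy data, toy model, and learning rate below are illustrative, not the video's exact values.)

      import torch
      from torch.optim import SGD

      b = torch.tensor(0.0, requires_grad=True)            # the parameter being optimized
      optimizer = SGD([b], lr=0.1)

      inputs = torch.tensor([0.0, 0.5, 1.0])               # illustrative doses
      labels = torch.tensor([0.0, 1.0, 0.0])               # illustrative effectiveness values

      optimizer.zero_grad()
      for x, y in zip(inputs, labels):
          loss = ((x + b) - y) ** 2                        # toy model: prediction = dose + b
          loss.backward()                                  # gradients from every data point accumulate
      optimizer.step()                                     # one update from the summed gradients = plain gradient descent
      print(b)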

  • @aaronm6675
    @aaronm6675 2 years ago +1

    Finally🔥

  • @EmilyBoInvests
    @EmilyBoInvests 1 year ago

    Awesome video! Thanks, Josh! Can you please explain what super() does in the _init_()?

    • @statquest
      @statquest  1 year ago

      Great question! So, we're making a new class that is derived from nn.Module, and nn.Module is derived from something else, and all those things need to be initialized, so "super()" does that for us.
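      (A small sketch of that point: super().__init__() runs nn.Module's own setup, which is what lets nn.Parameter and submodules get registered on the model. TinyNet is an illustrative name, not from the video.)

      import torch
      import torch.nn as nn

      class TinyNet(nn.Module):
          def __init__(self):
              super().__init__()                           # initialize nn.Module (and its ancestors) first
              self.b = nn.Parameter(torch.tensor(0.0))     # now this parameter gets registered

          def forward(self, x):
              return x + self.b

      print(list(TinyNet().parameters()))                  # shows the registered parameter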

  • @vtphan2012
    @vtphan2012 2 years ago

    Hi Josh, did you look at and consider Keras before making this video? What do you think about it versus Pytorch?

    • @statquest
      @statquest  2 years ago +1

      To be honest, I've only ever worked with PyTorch.

  • @shamshersingh9680
    @shamshersingh9680 1 month ago

    Hi Josh, thanks again for helping me break the ice between me and PyTorch. Every time I see your videos, I wonder: if my instructor had taught us like this, our lives would probably have been much simpler and happier. I have a small doubt here. In the example, you showed gradient training of only the final bias. But in reality, all the weights have to be trained during backpropagation. So when I try to initialise all the weights with random values and then train the model, I do not get the final weights shown in the video. The code is as follows:
    # imports needed for the snippet below
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    import torch.optim as optim
    import seaborn as sns
    import matplotlib.pyplot as plt

    class BasicNN(nn.Module):
        def __init__(self):
            super().__init__()
            self.w00 = nn.Parameter(torch.randn(1), requires_grad=True)
            self.b00 = nn.Parameter(torch.randn(1), requires_grad=True)
            self.w01 = nn.Parameter(torch.randn(1), requires_grad=True)
            self.w10 = nn.Parameter(torch.randn(1), requires_grad=True)
            self.b10 = nn.Parameter(torch.randn(1), requires_grad=True)
            self.w11 = nn.Parameter(torch.randn(1), requires_grad=True)
            self.b_final = nn.Parameter(torch.randn(1), requires_grad=True)

        def forward(self, input):
            input_top_relu = input * self.w00 + self.b00
            input_bottom_relu = input * self.w10 + self.b10
            output_top_relu = F.relu(input_top_relu) * self.w01
            output_bottom_relu = F.relu(input_bottom_relu) * self.w11
            input_final_relu = output_top_relu + output_bottom_relu + self.b_final
            output = F.relu(input_final_relu)
            return output

    # Create an instance of the neural network
    model = BasicNN()

    # Print parameters
    print('Parameters before training')
    for name, param in model.named_parameters():
        print(name, param.data)

    # Define inputs and corresponding labels
    inputs = torch.tensor([0., 0.5, 0.1])
    labels = torch.tensor([0., 1.0, 0.])

    # Define a loss function
    criterion = nn.MSELoss()

    # Define an optimizer
    optimizer = optim.SGD(model.parameters(), lr=0.01)

    # Number of epochs for training
    epochs = 1000

    # Training loop
    for epoch in range(epochs):
        total_loss = 0

        # Forward pass
        output = model(inputs)

        # Compute the loss
        loss = criterion(output, labels)
        total_loss += loss

        # Backward pass
        loss.backward()        # Compute gradients
        optimizer.step()       # Update weights
        optimizer.zero_grad()  # Clear previous gradients

        # Print loss every 100 epochs
        if (epoch + 1) % 100 == 0:
            print(f"Epoch [{epoch+1}/{epochs}], Loss: {loss.item()}")

        if total_loss < 0.00001:
            print(f'Epoch = {epoch}')
            break

    # Print final parameters
    print('Parameters after training')
    for name, param in model.named_parameters():
        print(name, param.data)

    # Check the model performance
    input_doses = torch.linspace(start=0, end=1, steps=11)
    output = model(input_doses)
    sns.set(style='whitegrid')
    sns.lineplot(x=input_doses, y=output.detach(), color='green', linewidth=2)
    plt.xlabel("Input Doses")
    plt.ylabel("Effectiveness")
    plt.show()
    I'd be grateful if you could help me with the code above.

    • @statquest
      @statquest  1 month ago +1

      This example only works to optimize the final bias term.
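      (In other words, in the video's example every parameter except the final bias is created with requires_grad=False, so backpropagation only ever moves the final bias. A minimal sketch of that difference, with placeholder numbers:)

      import torch
      import torch.nn as nn
      from torch.optim import SGD

      w_frozen = nn.Parameter(torch.tensor(1.7), requires_grad=False)    # fixed, like the video's weights
      final_bias = nn.Parameter(torch.tensor(0.0), requires_grad=True)   # the one parameter being trained

      optimizer = SGD([final_bias], lr=0.1)
      loss = (w_frozen * 0.5 + final_bias - 1.0) ** 2                    # toy loss with placeholder data
      loss.backward()
      optimizer.step()
      print(w_frozen, final_bias)                                        # w_frozen unchanged; final_bias updated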

  • @junqichen6241
    @junqichen6241 2 years ago +1

    Hi Josh, could you explain how this line of code works? output_values = model(input_doses). My understanding is model has a method called forward, so shouldn't it be output_values = model.forward(input_doses)?

    • @statquest
      @statquest  2 years ago +1

      I answer that question at 9:13
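      (The short version, as a sketch: calling model(x) goes through nn.Module.__call__, which runs any registered hooks and then calls forward(x) for you, so calling forward() directly is discouraged. Doubler is an illustrative module, not from the video.)

      import torch
      import torch.nn as nn

      class Doubler(nn.Module):
          def forward(self, x):
              return 2 * x

      model = Doubler()
      x = torch.tensor([1.0, 2.0])
      print(model(x))          # preferred: dispatches to forward() via __call__
      print(model.forward(x))  # same result here, but it bypasses the hook machinery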

  • @silveromayo4148
    @silveromayo4148 1 year ago

    Thanks for the great video. Does this apply directly to GNN? Can I apply it there?

    • @statquest
      @statquest  1 year ago

      To be honest, I don't know much about GNNs right now so I can't answer your question.

  • @vighneshsrinivasabalaji802
    @vighneshsrinivasabalaji802 6 months ago +1

    Hey Josh!
    Amazing videos, thanks a lot.
    Would be great if you could cover time series data and algorithms like ARIMA and Holt-Winters
    Thanks😊

    • @statquest
      @statquest  6 months ago

      I'll keep those topics in mind.