Animating the learning process of a Neural Network

  • Published: 16 Jul 2024
  • Let's animate the learning process of a Neural Network by showing its regression surface with the data points over the epochs of the optimization problem. Here is the updated code: github.com/Ceyron/machine-lea... A minimal sketch of these steps follows the timestamps below.
    Timestamps:
    00:00 Recap: Neural Networks and the sine function
    00:28 Create an Animation Object
    00:39 Define linearly spaced points for a line plot
    01:13 Initial Prediction of Network on the line
    01:32 Initial Plot consisting of data scatter and network line plot
    02:31 Reinitialize from the start
    03:18 Create frames as part of the training
    04:30 Creating a gif out of the animation
    04:40 Make title update current epoch number
    05:10 Discussion
    05:32 Outro
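    As referenced above, here is a minimal sketch of these steps, assuming a small Flux.jl network and Plots.jl for the animation; the layer sizes, learning rate, epoch count, and file name are illustrative, not the exact code from the repository:

    ```julia
    using Flux, Plots

    # Toy data: noisy samples of the sine function
    x_data = Float32.(rand(30) .* 2π .- π)
    y_data = sin.(x_data) .+ 0.1f0 .* randn(Float32, 30)
    X = reshape(x_data, 1, :)   # Flux expects (features, batch)
    Y = reshape(y_data, 1, :)

    # Linearly spaced points for the line plot of the current prediction
    x_line = Float32.(reshape(range(-π, π; length=200), 1, :))

    # Reinitialize the network from scratch so the animation starts
    # from the untrained state
    model = Chain(Dense(1 => 16, tanh), Dense(16 => 16, tanh), Dense(16 => 1))
    opt_state = Flux.setup(Adam(0.01), model)

    # One animation frame per training epoch: data scatter plus the
    # network's current line plot, with the epoch number in the title
    anim = @animate for epoch in 0:199
        # Plot before updating, so frame 0 shows the initial prediction
        scatter(x_data, y_data; label="data", ylims=(-1.5, 1.5))
        plot!(vec(x_line), vec(model(x_line)); label="prediction",
              title="Epoch $epoch")

        # One gradient-descent step on the mean squared error
        grads = Flux.gradient(m -> Flux.mse(m(X), Y), model)
        Flux.update!(opt_state, model, grads[1])
    end

    # Render the collected frames into a gif
    gif(anim, "learning_process.gif"; fps=20)
    ```

    Plotting before the parameter update is what makes the first frame show the untrained network; `gif(...)` then renders all collected frames into the final animation.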
    -------
    📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): github.com/Ceyron/machine-lea...
    📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: / felix-koehler and / felix_m_koehler
    💸 : If you want to support my work on the channel, you can become a patron here: / mlsim
    🪙: Or you can make a one-time donation via PayPal: www.paypal.com/paypalme/Felix...
    -------
    ⚙️ My Gear:
    (Below are affiliate links to Amazon. If you decide to purchase the product or something else on Amazon through this link, I earn a small commission.)
    - 🎙️ Microphone: Blue Yeti: amzn.to/3NU7OAs
    - ⌨️ Logitech TKL Mechanical Keyboard: amzn.to/3JhEtwp
    - 🎨 Gaomon Drawing Tablet (similar to a WACOM Tablet, but cheaper, works flawlessly under Linux): amzn.to/37katmf
    - 🔌 Laptop Charger: amzn.to/3ja0imP
    - 💻 My Laptop (generally I like the Dell XPS series): amzn.to/38xrABL
    - 📱 My Phone: Fairphone 4 (I love the sustainability and repairability aspect of it): amzn.to/3Jr4ZmV
    If I had to purchase these items again, I would probably change the following:
    - 🎙️ Rode NT: amzn.to/3NUIGtw
    - 💻 Framework Laptop (I do not get a commission here, but I love the vision of Framework. It will definitely be my next Ultrabook): frame.work
    As an Amazon Associate I earn from qualifying purchases.
    -------

Comments • 6

  • @soumilyade1057
    @soumilyade1057 1 year ago +1

    Damn, it came at the right time...
    Hope I'll be able to use it

  • @huhuboss8274
    @huhuboss8274 1 year ago +1

    Very cool!

  • @juliocardenas4485
    @juliocardenas4485 1 year ago +1

    Wow !!

  • @dougiehwang9192
    @dougiehwang9192 1 year ago +2

    I met Julia 2 years ago and learned a little bit with Pluto. It is way cooler than Python. I don't know why people aren't using it; I can't find any company using Julia in Korea. So sad. I have been a fan of your videos since the DAG one, but the fluid dynamics videos are way above my level (undergraduate level, knowing calculus, linear algebra, probability, and statistics). 😅
    Could you also make some videos explaining the Jacobian matrix and the FFT, both intuitively and math-rigorously? 🙏
    Also, coding videos like this one are welcome 😁

    • @MachineLearningSimulation
      @MachineLearningSimulation  11 months ago +2

      Hi,
      thanks so much for the kind comment! :) It took me way too long to reply to it. Unfortunately, my personal life has been very busy recently, and I wanted to give you a thorough answer. I hope that I can return to my weekly upload schedule soon.
      Coming to your comment: I also love Julia. What amazes me so much is that Julia is holding up quite well against Python despite not having big tech companies pouring millions of dollars into very sophisticated packages. To some extent it truly solves the two-language problem. Let's see what the future holds. Just a couple of days ago, Julia entered the top 20 of the TIOBE index.
      For my PhD research, I recently switched back to JAX, unfortunately. The higher sophistication of its automatic differentiation and, in particular, the automatic vectorization (including kernel switches) sold me on it. I will still do videos on Julia because I believe it is still better when it comes to large sparse iterative linear solvers and many other aspects. :D
      It is very nice to hear that you have been a fan of the channel from the early days. It's been quite a journey so far. I definitely want to come back to the topics you mentioned: more on Jacobians and on the FFT. They also align with my research, which is crucial for me because explaining these abstract topics truly helps me understand them. Unfortunately, I have to stop giving time estimates as to when I can upload a video on a certain topic; I have a terrible track record at this :D.
      Hope you still enjoy the videos and hope to see you again in the comment section under a future video :).