Nogunumo
  • Videos: 19
  • Views: 4,878

Videos

Convolutional Neural Network (CNN) in C++: Built from Scratch - No Libraries!
264 views · 1 day ago
Apologies if the microphone volume is a bit quiet in this video; I'll make adjustments for future uploads. Please consider turning up the volume for the best experience. Thank you! Welcome to the channel! I'm exploring AI and machine learning technologies, sharing insights, updates, and behind-the-scenes looks at my development process. If you're into AI, machine learning, or tech innovation, this is the place for you.
Devlog #11: [CNN] Implementing Deconvolution and MaxUnpool() - A Game Changer!
204 views · 14 days ago
Welcome to the channel! I'm exploring AI and machine learning technologies, sharing insights, updates, and behind-the-scenes looks at my development process. If you're into AI, machine learning, or tech innovation, this is the place for you. 0:00 - Intro 3:10 - How Deconvolution() works 7:28 - How MaxUnPool() works 11:56 - Outro #machinelearning #deeplearning #forwardpropagation #backpropagation
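For readers skimming without the video, here is a minimal sketch of the MaxUnpool idea, assuming the forward max pool recorded the flat index of each window's winner; the function name max_unpool and everything else below is illustrative, not the channel's actual code.

    #include <cstddef>
    #include <vector>

    // pooled:  values produced by a max pool
    // indices: flat position in the original H*W map each max came from
    // Writes each pooled value back at its recorded position, zeros elsewhere.
    std::vector<float> max_unpool(const std::vector<float>& pooled,
                                  const std::vector<std::size_t>& indices,
                                  std::size_t H, std::size_t W) {
        std::vector<float> out(H * W, 0.0f);
        for (std::size_t i = 0; i < pooled.size(); ++i)
            out[indices[i]] = pooled[i];
        return out;
    }
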
Devlog #10: [CNN] The convolution() Function Was Broken… Here's What Happened!
65 views · 21 days ago
In this devlog, I dive into a bug fix I made to my convolution() function: it turns out I was misunderstanding how shapes work! Welcome to the channel! I'm exploring AI and machine learning technologies, sharing insights, updates, and behind-the-scenes looks at my development process. If you're into AI, machine learning, or tech innovation, this is the place for you. 0:00 - Intro 0:07 - Removed ...
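As a rough illustration of the shape arithmetic a convolution() has to get right (this is the standard formula, not necessarily the devlog's actual bug or fix): for input size H_in, kernel K, stride S, and padding P, the output size is (H_in + 2P - K) / S + 1.

    #include <cstdio>

    // Output size along one dimension of a convolution.
    int conv_out_dim(int in, int k, int stride, int pad) {
        return (in + 2 * pad - k) / stride + 1; // integer division
    }

    int main() {
        // LeNet-style example: 32x32 input, 5x5 kernel, stride 1, no padding
        std::printf("%d\n", conv_out_dim(32, 5, 1, 0)); // prints 28
    }
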
AI Devlog #9: Finally Started Training LeNet in Batches - Say Goodbye to Batch Gradient Descent!
154 views · 1 month ago
Happy New Year! 🎆 In this devlog, I dive into training AI models in batches, a straightforward approach I hadn't explored until now. Welcome to the channel! I'm exploring AI and machine learning technologies, sharing insights, updates, and behind-the-scenes looks at my development process. If you're into AI, machine learning, or tech innovation, this is the place for you. 0:00 Intro 0:12 Gradien...
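Here is a minimal, runnable sketch of the mini-batch idea on a toy one-parameter model y = w*x (all numbers and names are made up for illustration; the LeNet loop in the video is structurally the same, just with more parameters): update after every small batch instead of once per full pass over the data.

    #include <cstddef>
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<float> x = {1, 2, 3, 4, 5, 6, 7, 8};
        std::vector<float> y = {2, 4, 6, 8, 10, 12, 14, 16}; // true w = 2
        float w = 0.0f, lr = 0.01f;
        const std::size_t batch = 4; // update after every 4 samples, not all 8
        for (int epoch = 0; epoch < 100; ++epoch) {
            for (std::size_t start = 0; start < x.size(); start += batch) {
                float grad = 0.0f;
                for (std::size_t i = start; i < start + batch; ++i)
                    grad += 2.0f * (w * x[i] - y[i]) * x[i]; // d/dw of (w*x - y)^2
                w -= lr * grad / batch; // average gradient over the mini-batch
            }
        }
        std::printf("w = %f\n", w); // converges toward 2
    }
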
AI Devlog #8: Backpropagation in LeNet + Key Improvements!
148 views · 1 month ago
A devlog video about how I compute gradients in LeNet. In my opinion, the gradients of max pooling and convolution can be tricky. Welcome to the channel! I'm exploring AI and machine learning technologies, sharing insights, updates, and behind-the-scenes looks at my development process. If you're into AI, machine learning, or tech innovation, this is the place for you. 0:00 Intro 1:04 Computing G...
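One of those tricky gradients, sketched under simplifying assumptions (single channel, stride 1, no padding; the names are illustrative, not the video's code): the gradient of a convolution's weights is itself a correlation of the input with the output gradient.

    #include <vector>

    using Mat = std::vector<std::vector<float>>;

    // grad_w[u][v] = sum over outputs of input[i+u][j+v] * grad_out[i][j]
    // (valid convolution, stride 1, K x K kernel, single channel)
    Mat conv_weight_grad(const Mat& input, const Mat& grad_out, int K) {
        Mat grad_w(K, std::vector<float>(K, 0.0f));
        const int H_out = (int)grad_out.size(), W_out = (int)grad_out[0].size();
        for (int u = 0; u < K; ++u)
            for (int v = 0; v < K; ++v)
                for (int i = 0; i < H_out; ++i)
                    for (int j = 0; j < W_out; ++j)
                        grad_w[u][v] += input[i + u][j + v] * grad_out[i][j];
        return grad_w;
    }
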
AI Devlog #7: Working on LeNet, and some improvements.
241 views · 2 months ago
In this video, I share an update on my recent work. I think 3-dimensional data in CNNs is quite cumbersome to handle. Welcome to the channel! I'm exploring AI and machine learning technologies, sharing insights, updates, and behind-the-scenes looks at my development process. If you’re into AI, machine learning, or tech innovation, this is the place for you. 0:00 Intro 0:15 Removed class-based m...
Recurrent Neural Network: Gated Recurrent Unit (GRU) Built from Scratch in C++!
121 views · 2 months ago
This time, I learned so much about sequential models, especially while explaining the concepts in detail! It's definitely faster than LSTMs. Enjoy the video! 0:00 Intro 1:13 Vanishing and exploding gradients 2:58 Preprocessing 3:23 Initialize the parameters in the constructor 4:46 forward() function 8:15 BPTT 9:06 Demo 11:02 Outro #machinelearning #deeplearning #forwardpropagation #backpropagation
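For reference, a scalar single-step GRU sketch using one common gating convention (real cells use vectors and matrices, and every weight value below is a placeholder): the update gate z decides how much of the previous hidden state to overwrite, and the reset gate r decides how much of it feeds the candidate.

    #include <cmath>
    #include <cstdio>

    float sigmoid(float v) { return 1.0f / (1.0f + std::exp(-v)); }

    // One GRU step for scalar x and h.
    float gru_step(float x, float h_prev,
                   float Wz, float Uz, float Wr, float Ur, float Wh, float Uh) {
        float z = sigmoid(Wz * x + Uz * h_prev);              // update gate
        float r = sigmoid(Wr * x + Ur * h_prev);              // reset gate
        float h_cand = std::tanh(Wh * x + Uh * (r * h_prev)); // candidate state
        return (1.0f - z) * h_prev + z * h_cand;              // blend old and new
    }

    int main() {
        float h = 0.0f;
        float xs[] = {0.5f, -1.0f, 0.25f};
        for (float x : xs) // placeholder weights
            h = gru_step(x, h, 0.8f, 0.4f, 0.6f, 0.3f, 1.0f, 0.5f);
        std::printf("h = %f\n", h);
    }
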
Long Short-Term Memory (LSTM): Built from Scratch in C++!
205 views · 3 months ago
I thought it would take more time to implement, but it didn't. I guess that's because it's essentially an RNN, just a more advanced version. One thing to note is that I still had to use the Adam optimizer, which I think was due to the short sequence length I chose. Additionally, I had to slice the weights to focus only on the portions that contributed to the hidden states and to ensure...
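The weight slicing mentioned above is a common LSTM layout trick; here is a hedged sketch of one version of it (the stacking order, sizes, and names are assumptions, not the video's code): the four gates' weights live in a single block of 4*hidden rows, and each gate's portion is sliced out by row range.

    #include <cstddef>
    #include <vector>

    using Mat = std::vector<std::vector<float>>;

    // W stacks the weights of the four LSTM gates (e.g. input, forget,
    // cell, output) as 4*hidden rows; return the rows of gate g in {0..3}.
    Mat slice_gate(const Mat& W, std::size_t hidden, std::size_t g) {
        return Mat(W.begin() + g * hidden, W.begin() + (g + 1) * hidden);
    }
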
Recurrent Neural Network: Built from Scratch in C++!
892 views · 3 months ago
Finally, I implemented it! Everything was challenging: data preparation, forward propagation and backpropagation, and especially the optimizer. It just wouldn't work without Adam! I guess that's why LSTMs were invented. Enjoy the video! 0:00 Intro 0:34 Preprocessing 1:09 Prepare x and y 3:11 Forward propagation 6:59 BPTT 11:11 Demo 19:40 Outro #recurrentneuralnetwork #machinelearning #deeplearning #forwardpropagation
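The recurrence at the heart of it, as a scalar sketch (illustrative weights; the video uses full matrices): h_t = tanh(W*x_t + U*h_prev + b), with the same U reused at every step, which is exactly what BPTT has to differentiate through and why plain RNN gradients vanish or explode.

    #include <cmath>
    #include <cstdio>

    int main() {
        float W = 0.5f, U = 0.9f, b = 0.0f, h = 0.0f; // illustrative weights
        float xs[] = {1.0f, 0.5f, -0.25f};
        for (float x : xs) {
            h = std::tanh(W * x + U * h + b); // same W, U reused every step
            std::printf("h = %f\n", h);
        }
        // BPTT differentiates back through every reuse of U, which is why
        // gradients over long sequences vanish or explode without help.
    }
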
AI Devlog #6: Fixing Loss Calculation
109 views · 3 months ago
Before the fix, the loss fluctuated a lot, but once I incorporated the batch size into the loss calculation, the loss began to decrease more smoothly. It still fluctuates, but this is not due to miscalculation; rather, it's because the model isn't perfectly generalized to the dataset, which I need to improve in the future. Enjoy the video!
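A minimal sketch of the fix as described (names are illustrative): average the summed per-sample losses over the batch size, so the reported loss no longer scales with however many samples the batch happens to contain.

    #include <numeric>
    #include <vector>

    // Mean loss over a batch: a raw sum grows with the batch size and
    // fluctuates with it; dividing by the count removes that artifact.
    float batch_loss(const std::vector<float>& per_sample_losses) {
        float total = std::accumulate(per_sample_losses.begin(),
                                      per_sample_losses.end(), 0.0f);
        return total / static_cast<float>(per_sample_losses.size());
    }
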
Seq2seq: TextVectorization!
118 views · 7 months ago
Hey everyone! I'm building an AGI. Feel free to drop your questions in the comments, and I'd love to hear your thoughts on the process! 0:00 Intro 1:52 How it works 5:33 Demo 9:06 Outro
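As a toy sketch of what a TextVectorization step typically does (this mimics the general idea only, not Keras's layer of the same name or the video's exact code): split on whitespace and map each token to an integer id, growing the vocabulary the first time a token is seen.

    #include <cstdio>
    #include <sstream>
    #include <string>
    #include <unordered_map>
    #include <vector>

    // Map whitespace-separated tokens to integer ids; id 0 is reserved
    // (e.g. for padding), and new tokens get the next free id.
    std::vector<int> vectorize(const std::string& text,
                               std::unordered_map<std::string, int>& vocab) {
        std::istringstream in(text);
        std::vector<int> ids;
        std::string tok;
        while (in >> tok) {
            auto it = vocab.find(tok);
            if (it == vocab.end())
                it = vocab.emplace(tok, (int)vocab.size() + 1).first;
            ids.push_back(it->second);
        }
        return ids;
    }

    int main() {
        std::unordered_map<std::string, int> vocab;
        for (int id : vectorize("the cat sat on the mat", vocab))
            std::printf("%d ", id); // prints: 1 2 3 4 1 5
    }
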
AI Devlog #5: Refactoring the Training Code
160 views · 1 year ago
I am implementing machine learning models in C and CUDA. In today's video, I walk through how I refactored the training code. Please share any questions or suggestions below. Hope you enjoy! 0:00 Intro 1:10 Refactoring forward propagation 3:02 Refactoring metric logging 5:48 Refactoring parameter initialization 7:53 Backpropagation 16:00 Using only one hidden layer...
AI Devlog #4: L1L2 Regularization!
237 views · 1 year ago
I am implementing machine learning models in C and CUDA. In today's video, I will explain the L1 and L2 regularization I added to the model. Please share any questions or suggestions below. Hope you enjoy! 0:00 Introduction 1:39 Realizing my immaturity 2:48 Benefits of no longer writing code in NumPy 7:45 Explanation of L2 regularization 18:28 Demonstrating the consequences of introducing regularization 20:...
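A hedged sketch of how L1 and L2 penalties commonly enter the weight update (conventions vary, e.g. some fold the factor of 2 into lambda; all names are illustrative): the L1 term adds sign(w) and the L2 term adds 2*w to each weight's gradient.

    #include <cstddef>
    #include <vector>

    // One SGD step with L1 and L2 penalties added to the raw gradient:
    // L1 contributes lambda1 * sign(w), L2 contributes 2 * lambda2 * w.
    void regularized_update(std::vector<float>& w, const std::vector<float>& grad,
                            float lr, float lambda1, float lambda2) {
        for (std::size_t i = 0; i < w.size(); ++i) {
            float sign = (w[i] > 0.0f) ? 1.0f : (w[i] < 0.0f ? -1.0f : 0.0f);
            w[i] -= lr * (grad[i] + lambda1 * sign + 2.0f * lambda2 * w[i]);
        }
    }
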
AI Devlog #3: Gradient Clipping!
253 views · 1 year ago
I am implementing machine learning models in C and CUDA. In today's video, I will explain the gradient clipping I added to the model. Please share any questions or suggestions below. Hope you enjoy! 0:00 Introduction 1:08 Gradient clipping explanation 2:35 Assessing the model's performance 3:24 I need to decrease the number of hidden layers 8:43 Removed hyperparameters.h 11:02 Future updates
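For context, a minimal sketch of one standard variant, clipping by global L2 norm (the video may use a per-value clamp instead): if the gradient's norm exceeds max_norm, rescale every component proportionally so the direction is preserved.

    #include <cmath>
    #include <vector>

    // If the gradient's global L2 norm exceeds max_norm, scale every
    // component by max_norm / norm.
    void clip_by_norm(std::vector<float>& grad, float max_norm) {
        float sq = 0.0f;
        for (float g : grad) sq += g * g;
        float norm = std::sqrt(sq);
        if (norm > max_norm) {
            float scale = max_norm / norm;
            for (float& g : grad) g *= scale;
        }
    }
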
AI Devlog #2: Adding momentum to the model
170 views · 1 year ago
AI Devlog #1: New activation and loss functions!
102 views · 1 year ago
Neural Network: Overview 2
1.3K views · 1 year ago
Neural Network: Overview 1
192 views · 1 year ago

Comments

  • @jager3339
    @jager3339 · 1 month ago

    Happy New Year!

    • @nogunumo
      @nogunumo · 1 month ago

      Thank you! Same to you!

  • @HeadshotChen
    @HeadshotChen · 1 month ago

    Happy new year Nogu!

    • @nogunumo
      @nogunumo · 1 month ago

      Thank you! Happy New Year to you too!

  • @SomeRandomDudeAF
    @SomeRandomDudeAF · 3 months ago

    Some points on your video that I think you should consider if you make more:
    1. Lose the glasses if possible; they make you look... not serious.
    2. Think about getting a better mic.
    3. If this video is only meant to showcase what you have built, I guess that's fine, but you either need more enthusiasm in your voice to carry such a lengthy video, or you should edit it (I would prefer you do that either way) so that it gets to the point faster. If you are unsure how to entertain or be engaging, maybe shorter clips are easier to handle?
    4. Even if this is only a showcase, there is a lot of code to go through, and it will quickly get overwhelming for someone trying to get into it. This is where some PowerPoint slides going through the math could have been nice.
    5. If this was meant to be a tutorial, do not show all the code directly; that is the worst way to teach someone how something works. Build it up while explaining what you are doing and why it works. It is good for you to get deeper knowledge of it all and a learning experience for yourself.
    Keep working on it; as someone who also built a DNN from scratch in C#, I know this is complex sometimes. But it is fun, and that is why we do it. Also, I'm Swedish, so excuse my English 😂

  • @ВадимПоляков-т4ш
    @ВадимПоляков-т4ш · 3 months ago

    Finally! Someone made a tutorial on building more than a simple MLP in C++!!! You are the GOAT, man! Thank you!

    • @nogunumo
      @nogunumo · 3 months ago

      Thanks! Really glad you found it helpful; it means a lot!

  • @HeadshotChen
    @HeadshotChen · 6 months ago

    Cool thumbnail, Nogu! So good to see you upload.