The best intro to neural nets on YouTube
This made my day. Thank you!
You blew my mind when you showed that the input can also be taken in as matrices. Can't believe I'm in the final year of my degree and you are the one who showed me this. You deserve millions of subscribers.
You are more than welcome my friend. 🙏 Rock on! 🚀
Check out the 3Blue1Brown neural network series here on YouTube; it does a great job explaining this math logic.
True!
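For anyone curious what "input as matrices" looks like in practice, here is a minimal NumPy sketch of a batched forward pass. The layer sizes, the sigmoid activation, and the random weights are illustrative assumptions, not taken from the video:

```python
import numpy as np

def sigmoid(z):
    # element-wise logistic activation
    return 1.0 / (1.0 + np.exp(-z))

# four input samples stacked into one matrix, one sample per row (shape: 4 x 2)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.standard_normal((2, 3))   # weights: 2 inputs -> 3 hidden units
b1 = np.zeros(3)                   # hidden-layer bias
W2 = rng.standard_normal((3, 1))   # weights: 3 hidden -> 1 output
b2 = np.zeros(1)                   # output bias

# a single matrix multiplication pushes ALL four samples through each layer at once
hidden = sigmoid(X.dot(W1) + b1)       # shape: 4 x 3
output = sigmoid(hidden.dot(W2) + b2)  # shape: 4 x 1, one prediction per input row
```

The same two lines handle one sample or a thousand; only the number of rows in X changes.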
This is really some high-quality content and an amazing explanation. Sooner or later this channel will hit millions of subscribers. Just keep up the good work and don't slow down. Good luck 🤞
Thank you. 🙏
I agree… too bad the video ended; crystal-clear explanations… Thanks for the videos!
A superb master of functional content. I simply consider myself a remote student, and I have a real task to solve in a difficult case study, mostly image rendering. Thank you for the post; I gained a lot from an excellent teaching demonstration. M.
You are more than welcome my friend. 🙏 Rock on! 🚀
Thank you! I am a practicing data scientist and even I found the explanation combined with the graphics super helpful! You've earned a new subscriber. One question: at about 10:30, you show the Layer 1 input matrix. I am assuming this is already transposed? In reality, the data would have a top row of 1s for the bias and a second row of 0, 1, 0, 1 for the first instance?
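One common way to get the layout this commenter describes is to fold the bias into the weight matrix and prepend a row of ones to the (transposed) input, so the bias add disappears into the matrix multiplication. A minimal sketch of that convention; the sizes and random values are illustrative, not the video's:

```python
import numpy as np

# transposed layout: each COLUMN is one input instance (shape: 2 x 4)
X = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1]], dtype=float)

# prepend a row of ones so the bias becomes just another weight
X_aug = np.vstack([np.ones((1, X.shape[1])), X])  # shape: 3 x 4

rng = np.random.default_rng(0)
b = rng.standard_normal((3, 1))   # bias, one entry per hidden unit
W = rng.standard_normal((3, 2))   # weights: 2 inputs -> 3 hidden units
W_aug = np.hstack([b, W])         # shape: 3 x 3, bias in the first column

# W_aug @ X_aug gives exactly W @ X + b
assert np.allclose(W_aug @ X_aug, W @ X + b)
```

Whether a tutorial uses this augmented form or keeps a separate bias vector is purely a notational choice; the computed values are identical.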
This is amazing; it's short and easy to understand.
Thank you very much!
Very nice, thank you!
Love the content, I hope you have more coming in this series! Keep up the great work and amazing content.
Thank you! I have a video on back-propagation in the pipeline.
Great video. Loving this series!
Thank you. 🙏 That means a lot. I love creating them. ☺️
Also, in NumPy you can use @ for an explicit dot/matmul product of two arrays. It helps keep things cleaner when there are several matmuls in a row.
Thanks for the video, love the background music choice!
Thank you. 🙏
Nice video!
Thank you. 🙏
Indeed, a great intro to the why and how of matrix multiplication in NNs. What software did you use to get the excellent supporting visuals?
Thank you. I use After Effects a lot.
I love the explanation. Thank you.
You are more than welcome my friend. 🙏 Rock on! 🚀
Very helpful, thank you!
You are more than welcome my friend. 🙏
Great stuff as per usual! Looking forward to more! Instant like every time.
Thank you. 🙏 This means a great deal to me.
Vectorize... ALL THE THINGS!!!
You pretty much summarized this video. Well done. 🤣
Last time I was this early I wrote 50 lines of code without error.
I was never this early.
😂 You are amazing, Igor. Thank you for chiming in so early and giving the video some YouTube algorithm gold.
@@KieCodes My pleasure, been watching you from the beginning :)
@@igornowicki29 I know my friend. Thank you so much. 🙏
remember me when you get a million subscribers
I will Hector!!! 🙏 Thank you for being here!
Man, it's so difficult to follow NN tutorials that play around with notation; it messes up the whole thinking. Is there no standard for these notations? The ones I follow for the NN I am building are quite different, so it becomes challenging to draw parallels.
Thank you for this video
You are welcome
Great video!
Thank you. 🙏
so finally, it's here
amazing...
I hope you like it. Thanks for being here right away! 🙏
Good job- well done!
Thank you! Glad you got something out of it.
Awesome 😎
You are awesome! 😘
Great video! What about the correction?
What correction?
@@KieCodes I mean that when you have the outputs of the network, you can calculate the error and then apply a correction with, for example, gradient backpropagation. (Sorry, English is not my first language.)
Yeah that is another video I have planned.
@@KieCodes Cool! I can't wait :) I'm going to try to do it with what I learned from your video. See you!
any videos about back propagation ?
Not yet. But in the pipeline.
Did he just change the definition of parallel?
Did I? 😅