This is beautiful. For the first time I got a clear understanding of the basics of neural networks. Thank you very much!
Hi Daniel, this video blew my mind! Quick explanation: I'm 19 and I've spent the last year learning linear algebra in maths lessons (vector spaces, matrices...) and for the first time, someone is giving me a practical use for all of these (very) theoretical things, and what's more, it's related to my favourite subject!
Thank you so much for all your work, it's awesome and it's enabling me to improve myself in IT, but also in English as I'm French :P
I want to say there are more common uses for matrix calculation, which we (in Germany) were taught in school — mostly equations related to real-world problems.
Finding solutions for systems of linear equations, doing linear transformations to map between coordinate systems (this very channel has a series of videos where he used some linear algebra to get room-scale point cloud data from a Kinect sensor that basically outputs a matrix), electrical circuits, and I've even heard about using linear algebra to model real-life traffic flow. The way all this math is taught where I live (Brazil) is not optimal from the perspective of a student who just wants to know how to use it. To be honest, most of them won't anyway. But once you get a glimpse of how versatile this bit of maths is, everything just clicks in your mind and it's wonderful.
This channel should have an order of magnitude more subscribers than it does. Truly a great resource.
I'm taking a deep learning course this semester and I'm so glad these vids exist lol.
I can't believe I watched this for free!!!
Thank you very much!!
Dude, you're amazing as a teacher. First time ever I've been able to properly understand this. Also, the one with the Markov chains was a super awesome video. Thank you for your work!!!
Amazing how the same topic taught by different people can be either boring/ultra-complicated as hell, or so inspiring that you want to build something right away after the lesson. This guy falls immediately into the second category. It's like he can unlock something inside you which makes you understand something you never understood before.
This is really fascinating! I'd hear about things like this, but I never thought I would get an understanding of how it works in my life! Now I'm starting to see that change way sooner than I ever thought it would! I don't really want to learn this from anyone else except you! This is probably one of the best ways to thoroughly teach and learn this!
Keep up the good work... finally, someone who can explain this the right way.
These videos are awesome. I've watched many other videos and I always ended up getting lost, basically just copying and pasting code and hoping it works right. But with your videos I am actually understanding what's going on, and I'm able to troubleshoot on my own if I do something wrong and figure out what I did incorrectly, because I actually understand how it's supposed to work. Great job!
good work!
I'm a beginner programmer and you really help me get better!!
Never stop making videos!
keep it up
I know he later corrects it one way, but at 17:30, if you want to keep the subscripts of the weights as (row, column) while also keeping the row index as the input index and the column index as the hidden-layer index, you could take the transpose of the input matrix and multiply it (on the left) by the weight matrix. This would result in a 1 x n matrix for the output, but all the subscripts would work out in a way that's easier to understand.
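Sketching that idea in plain JavaScript (illustrative names only; this is not the video's Matrix class): with the weights stored as w[input][hidden], multiplying the input row on the left keeps every subscript in natural (input, hidden) order.

```javascript
// Row vector times matrix: x (1 x n) * w (n x m) -> 1 x m output row.
// Here w[i][j] reads naturally as "weight from input i to hidden node j".
function rowTimesMatrix(x, w) {
  const m = w[0].length;
  const out = new Array(m).fill(0);
  for (let j = 0; j < m; j++) {
    for (let i = 0; i < x.length; i++) {
      out[j] += x[i] * w[i][j]; // subscript order matches (input, hidden)
    }
  }
  return out;
}

// Example: 2 inputs feeding 2 hidden nodes
const x = [1, 2];
const w = [
  [0.5, 0.1], // from input 0 to hidden 0 and hidden 1
  [0.2, 0.3], // from input 1 to hidden 0 and hidden 1
];
console.log(rowTimesMatrix(x, w)); // ≈ [0.9, 0.7]
```

This computes the same numbers as the weights-times-inputs product in the video; only the storage convention and the way you read the subscripts change.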
Honestly speaking, you are the best teacher I have ever had. Love from Pakistan. And keep this great stuff going 👍
Dan, I hope you get well soon... waiting for your return on the Coding Train!!!
Great video! I hope you bounce back quickly from your incident, my good sir!
Very descriptive explanation. Thank you for making my lectures easier.
I really enjoy your teaching style. Quirky, funny and informative.
thank you!
Great explanation of neural networks. Based on your latest code and Tariq Rashid's code, I've written my own neural network library in C++, but with customizable multiple layers. I mean, it's not great, but it's working. Can't wait to see how the series continues.
Well explained. After watching many videos, I found this is the best one; it gave me a clear idea about NNs.
Hey sir, you are the most wonderful teacher I've ever had.
Just love your classes.
Be quiet, kid.
Are you going to eventually move on to evolving the overall topology (how the graph is connected), for example the NEAT algorithm?
(Note: you used * for scalar multiplication and × for matrix and vector notation. If you are planning to cover the cross product, you should probably avoid using × for multidimensional products that are associative... uh, I mean, use it only for the cross product.)
You are fantastic! I'm learning from your explanations! Thank you!!!!!
Love the energy and excitement, happy to have discovered you!
Following along and making a library for Java! Nice video!
Sorry to hear about your arm. I will pray to God for your fast recovery. Get well soon ☺☺☺ brother.
Awesome explanation! This is how 'teaching' is done! :)
Thank you for these videos. Genuinely interesting and very well presented.
Hi! I loved your explanation! It's really clear and easy to understand, great teacher, thank you very much ☺️
This video is still good learning material today!! Thank you.
Hello, I just would like to say Thank You! After your videos I get it.
Also thank you for your clear pronunciation.
Thanks so much for explaining the concept so well!
Hi Daniel, get well soon and keep up the amazing work!
Hey Dan, first of all, I love your content and your personal way of engaging with your viewers! You have a very natural and captivating presence. I would like to ask you and the rest of the audience a question. I'm hopefully starting software development at a university in August, and I have just been wondering: could there be any interest in the programming community in following a fresh programmer? I have been thinking about starting a blog or a YouTube channel where I would talk about the things we learn and some of the challenges.
Is it interesting to learn with a new student or should I wait until I get more experienced?
If it is in any way interesting, how can I prepare to start?
Thank you, I've just started to study machine translation. Your videos help me to understand it more deeply. I haven't seen the next videos yet, but I hope you will also make one in which you use Python 3, and that you'll explain more maths like the sigmoid function, softmax, ... :)
Great video, very clearly explained, thank you!
Fall in Love With ML ❤️
This man is gold
Great explanation. What about the bias that should be added along with the weights and inputs when you do forward propagation? You didn't mention it here.
This video is awesome, thanks... for the first time I completely understand why we need linear algebra in neural networks xD
THANK YOU.
Excellent.. you are an awesome professor.. keep uploading your videos.
You talk about the normalisation of inputs that are much larger than other inputs. Could this just be handled by adjustment of the weights to normalise the very large input?
I love you so much. You are so likable, so funny. You are awesome.
How would you manage "degrees of freedom"? Or is there a feasibility problem to check before you dump data into a neural net, which will try to fit the data regardless? Great vids!
You are the best, really. You make it easy, and I like MLPs just because of you!
19:44 I got goosebumps when I heard Dan say 'Python'.
Very amazing stuff and explanation. I want to ask: will you do a playlist on convolutional NNs? It would help a lot.
Thank you !
This is really helping me!
Man you just help a lot of people......Thanks😁
OMG. I thought that I would never understand it. But it was all about the teacher. Neural networks are so easy. I think I will retake the BBM406 Fund. of Mach. Learning lecture... and I will write down the increase in my grade.
When I first saw the 'AND/OR truth table' I was like 'really? wtf, I'm not gonna watch that', but after I watched, I got the concept. It really helped me, thanks a lot.. :P
Excellent!! Keep it up, but could you please tell me how we can know how many hidden neurons to choose for the hidden layers? I mean, is there any formula?
You are just amazing! When you explain it, everything seems simple ^^'
Daniel, you are an amazing guy and you do an amazing job! Thank you so much!
Hope your arm gets better soon :)
Hi man, nice to see you here! I enjoy your CS:GO videos just as much as Dan's videos
Hey, does p5 support ECMAScript 6? Specifically classes and inheritance, because those are easier to teach junior students than plain object literals.
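For what it's worth, p5.js sketches run whatever JavaScript the browser supports, so ES6 classes and inheritance work fine in any modern browser. A minimal sketch of the kind of class the commenter means (the names here are made up for illustration):

```javascript
// A plain ES6 class; usable as-is inside a p5 sketch.
class Neuron {
  constructor(bias) {
    this.bias = bias;
    this.inputs = [];
  }
  addInput(value) {
    this.inputs.push(value);
    return this; // allow chaining
  }
  output() {
    // Raw weighted sum stand-in: just inputs plus bias
    return this.inputs.reduce((s, v) => s + v, this.bias);
  }
}

// Inheritance works the same way as in other class-based languages:
class SigmoidNeuron extends Neuron {
  output() {
    const sum = super.output();
    return 1 / (1 + Math.exp(-sum));
  }
}

const n = new SigmoidNeuron(0).addInput(0);
console.log(n.output()); // sigmoid(0) = 0.5
```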
11:46 You say you would need to normalize the inputs. Is this really needed, since all the inputs are weighted? If the house area is weighted by a really small number, let's say 0.00001, it would automatically get normalized, right?
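For the forward pass alone, this intuition holds: the scale can live in the input or in the weight, and the weighted sum is the same. A tiny sketch with made-up numbers:

```javascript
// Weighted sum of inputs: scaling the weight down is equivalent
// (for the forward pass) to scaling the input down.
const weightedSum = (inputs, weights) =>
  inputs.reduce((sum, v, i) => sum + v * weights[i], 0);

// House area ~ 100000, bedrooms ~ 3:
const raw = weightedSum([100000, 3], [0.00001, 0.5]); // tiny weight absorbs the scale
const normalized = weightedSum([1, 3], [1, 0.5]);     // input pre-scaled instead
console.log(raw, normalized); // ≈ 2.5 and 2.5 — identical forward pass
```

The usual caveat is that normalization matters during *training*: it keeps all the weights at a similar scale, so one learning rate works for every weight instead of some weights needing tiny updates and others huge ones.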
Best tutorials on YouTube!!! Keep going ;p
This Video Really Has Helped Me!
Thumbs Up!
Isn't the house prediction linearly separable though?
What programming environment are you using? Why aren't you using Processing anymore in the series?
Hi Daniel Shiffman! I have loads of questions from this video and several others of yours I've watched... I am 14; we were taught all of this linear algebra last year in school, so I have no problem with that. I want to know why we use perceptrons and neural networks when we can do the same thing with linear and polynomial regression (I've tried it and it took very little time and processing power; it also had a lower loss). Also, can we optimize the weights in a neural net using the genetic algorithms you described in a video series you have already made?
Please, whoever can answer these questions... I have no idea who to ask and I don't trust Stack Exchange (somehow they aren't allowing me to post anything there). Thank you for making this video; it really helped me, and so did all the others I've watched. They have pulled me out of video game addiction... (I think)
*PS (for everyone who has had trouble with matrices): these videos on neural nets by you and fastai really help me ace my math tests...
I like to use simple scenarios that don't need neural networks to practice / learn about neural networks. So that's really the main reason here!
@@TheCodingTrain Hi again! I understood that from the video, but what I want to know is: in what kind of situation would a neural network work better than polynomial regression (because in my eyes they seem the same)? I hope I'm not bothering you too much...
Also, I am not making any reference to the video's content.
Cheers, -my_youtube_username!
VERY GOOD
KEEP GOING
great video sir
The best!
Thank you so much!
Can you also create a perceptron using Python and PyTorch?
Hello all, what I don't get is: what's the difference between a feedforward neural network trained with backpropagation and an MLP? Is there someone here who can explain the difference?
thanks for making this video!
I'm not sure if you've ever introduced yourself? Anyway, keep up the good work. Thank you!
he introduces himself at the beginning of every livestream.
You are awesome so far!
Is this a repost? I swear I saw this exact same thing a few days ago when I binged the series. I even called out that you made a mistake when connecting the lines from the hidden layer to the output layer.
How are there comments from 3 days ago when it says it was published June 30 (TODAY)?
Was there a glitch in The Matrix?! :P
The reason I am back at this video is that it was in my notifications as a new video in the series, but it obviously is not.
It's because you saw this video while it was still unlisted.
We make all the videos available as soon as we can to anybody who wants to watch the whole series, but we publish them roughly once a day to keep a steady flow of videos coming out. -MB
Okay, that makes sense! Thanks!
Hey, I have a question: can you do this in Processing?
If I want to program games, what language do I need to study and what program do I need to download? PS: I'm an Italian boy, so my English is bad.
One genius guy.^
THEEEE best!
Can you tell us the process you use to make a project?
Keep it up
Your video helps a lot.
Thank you!
What are you taking... Humphry Davy's N2O (nitrous oxide), the laughing gas?
It was helpful
You should have gone for a football formation such as 4-4-2 or a 3-5-2
Is your Matrix powered by human beings?
Isn't a bias required for the hidden layers? What if all the values are zero?
Typically, a single bias node is added for the input layer and every hidden layer in a feedforward network. You are right, it helps with dealing with zero inputs.
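A quick sketch of the zero-input point (illustrative code, not the video's library): without a bias, a neuron fed all zeros always outputs sigmoid(0) = 0.5, no matter what its weights are; only the bias can shift that.

```javascript
const sigmoid = (z) => 1 / (1 + Math.exp(-z));

// One neuron: weighted sum of inputs plus bias, passed through sigmoid.
function neuron(inputs, weights, bias) {
  const sum = inputs.reduce((s, v, i) => s + v * weights[i], bias);
  return sigmoid(sum);
}

// With all-zero inputs, the weights contribute nothing to the sum:
console.log(neuron([0, 0], [0.4, -0.7], 0));   // always sigmoid(0) = 0.5
console.log(neuron([0, 0], [0.4, -0.7], 1.2)); // sigmoid(1.2) ≈ 0.77
```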
When is the next live session for NN?? (I think it is today, but I'm not sure.)
I was wondering the same thing. Can anybody clarify the actual schedule?
Wow, you are amazing.
Hey Daniel, how do you modify this when multiple hidden layers are required? It would be useful if you taught that. Thank you :)
I have made a version with multiple hidden layers
github.com/lomikstik/javascript-p5/blob/master/neuro-evolution-shiffman%2Blib/flappy_for_multi_layer_nn/libraries/nn_multi.js
Yeah! How to modify the weights in a neural net... maybe he explains that later on.
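One way to picture the generalization (a hedged sketch in plain JavaScript, not the linked library or the video's code): treat each layer as one weights-times-inputs step, and chain them, so any number of hidden layers is just a loop over layers.

```javascript
const sigmoid = (z) => 1 / (1 + Math.exp(-z));

// One layer: out[j] = sigmoid(sum_i weights[j][i] * inputs[i] + biases[j])
function layer(inputs, weights, biases) {
  return weights.map((row, j) =>
    sigmoid(row.reduce((s, wji, i) => s + wji * inputs[i], biases[j]))
  );
}

// layers is an array of {weights, biases};
// feed the output of each layer into the next.
function feedforward(inputs, layers) {
  return layers.reduce((acc, l) => layer(acc, l.weights, l.biases), inputs);
}

// Example: a 2 -> 3 -> 1 network with arbitrary weights
const net = [
  { weights: [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]], biases: [0, 0, 0] },
  { weights: [[0.7, -0.8, 0.9]], biases: [0.1] },
];
console.log(feedforward([1, 0], net)); // single sigmoid output in (0, 1)
```

Adding another hidden layer is just another `{weights, biases}` entry in `net`; the forward pass needs no other changes (backpropagation through multiple layers is the harder part).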
Why not make this course with Python? (Python for AI)!!
Nice work
Hello, how can I get this software, Neusciences Neuframe v4?
Why can't you do this stuff with Java??? Please let me know, because I'm trying to do NN coding with Java.
You can find a Java port of this series here!
github.com/CodingTrain/Toy-Neural-Network-JS/blob/master/README.md#libraries-built-by-the-community
Good video
But you made a mistake at 18:22.
It has to be: h1 = w11*x1 + w12*x2
h2 = w21*x1 + w22*x2
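The corrected indexing, written out as a tiny sketch (hypothetical helper, not the video's Matrix class): reading w[j][i] as "weight from input i to hidden node j", the matrix-vector product produces exactly those two sums.

```javascript
// H = W * X, where row j of W holds the weights into hidden node j.
function matVec(w, x) {
  return w.map((row) => row.reduce((s, wji, i) => s + wji * x[i], 0));
}

const W = [
  [2, 3], // w11, w12
  [4, 5], // w21, w22
];
const X = [1, 10];
// h1 = w11*x1 + w12*x2 = 2*1 + 3*10 = 32
// h2 = w21*x1 + w22*x2 = 4*1 + 5*10 = 54
console.log(matVec(W, X)); // [32, 54]
```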
Collab with Siraj Raval! Waiting to see you both in a video; that would be a blast!
ruclips.net/video/ad7SUyN5V7A/видео.html
I wish you could do it in Python.
Can't you write the code in Python? It would be more helpful if you could do so.
ECE 449
0:16:20 you are at 10.5
Hi there, how's it going? Thank you for the energetic description. Can I have your email for further discussion?
Really good, but one thing I must say: they're a bit too comprehensive. They should be more summarized, so the listener can get more in less time.
Why not stop now?
Nice video! First 750 views :D
w_{12} and w_{21} should be switched. Edit: Ah, it was caught :-)
I didn't understand...