10.5: Neural Networks: Multilayer Perceptron Part 2 - The Nature of Code

  • Published: 22 Oct 2024

Comments • 132

  • @KayYesYouTuber
    @KayYesYouTuber 5 years ago +39

    This is beautiful. For the first time I got a clear understanding of the basics of neural networks. Thank you very much.

  • @pchol5972
    @pchol5972 7 years ago +65

    Hi Daniel, this video blew my mind! Quick explanation: I'm 19 and I've spent the last year learning linear algebra in maths lessons (vector spaces, matrices...), and for the first time someone is giving me a practical use for all of these (very) theoretical things, and it's moreover related to my favourite subject!
    Thanks a lot for all your work. It's awesome and it's enabling me to improve in IT, but also in English, as I'm French :P

    • @Ben-up4lj
      @Ben-up4lj 4 years ago +2

      I want to say there are more common uses for matrix calculations, which we (in Germany) were taught in school, mostly for systems of equations related to real-world problems.

    • @guilhermetorresj
      @guilhermetorresj 3 years ago +2

      Finding solutions for systems of linear equations, doing linear transformations to map between coordinate systems (this very channel has a series of videos where he used some linear algebra to get room scale point cloud data from a kinect sensor that basically outputs a matrix), electrical circuits, and I've even heard about using linear algebra to model real-life traffic flow. The way all this math is taught where I live (Brazil) is not optimal from the perspective of a student who just wants to know how to use it. To be honest, most of them won't anyway. But once you get a glimpse of how versatile this bit of maths is, everything just clicks in your mind and it's wonderful.

  • @trickstur5994
    @trickstur5994 1 year ago +2

    This channel should have an order of magnitude more subscribers than it does. Truly a great resource.

  • @EntwinedGraces
    @EntwinedGraces 7 years ago +7

    I'm taking a deep learning course this semester and I'm so glad these vids exist lol.

  • @Chuukwudi
    @Chuukwudi 3 years ago +8

    I can't believe I watched this for free!!!
    Thank you very much!!

  • @FabianMendez
    @FabianMendez 7 years ago +11

    Dude, you're amazing as a teacher. It's the first time I've ever been able to properly understand this. The Markov chains video is also super awesome. Thank you for your work!!!

    • @Qual_
      @Qual_ 5 years ago +1

      Amazing how the same topic taught by different people can either be boring/ultra complicated as hell or so inspiring that you want to build something right away after the lesson. This guy falls immediately into the second category. It's like he can unlock something inside you that makes you understand something you never understood before.

  • @zinsy23
    @zinsy23 4 years ago +1

    This is really fascinating! I hear about things like this, but I never thought I would get an understanding of how it works in my life! Now I'm starting to see that change way sooner than I would have ever thought! I don't really want to learn this from anyone else except you! This is probably one of the best ways to thoroughly teach and learn this!

  • @BajoMundoUnderground
    @BajoMundoUnderground 7 years ago +49

    Keep up the good work... finally someone who can explain this the right way.

  • @elocore1702
    @elocore1702 6 years ago +3

    These videos are awesome. I've watched many other videos and I always ended up getting lost, basically just copying and pasting code and hoping it works. With your videos I actually understand what's going on, and I'm able to troubleshoot on my own if I do something wrong and figure out what I did incorrectly, because I actually understand how it's supposed to work. Great job!

  • @alonattar3836
    @alonattar3836 7 years ago +8

    Good work!
    I'm a beginner programmer and you really help me get better!!
    Never stop making videos!
    Keep it up.

  • @cameronnichols9905
    @cameronnichols9905 4 years ago +2

    I know he later corrects it one way, but at 17:30, if you want to keep the subscript of the weights as (row, col) while also keeping the row index as the input index and the column index as the hidden-layer index, you could take the transpose of the input matrix and multiply it (on the left) by the weight matrix. This would result in a 1 x n matrix for the output, but all the subscripts would work out in a way that's easier to understand.
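
    To make the indexing argument above concrete, here is a minimal standalone JavaScript sketch (not the video's library; the matrix values and helper code are invented for illustration) showing that the "input row vector times W" reading and the "W transposed times an input column" reading give the same hidden-layer values:

    ```javascript
    // W[i][j] = weight from input i to hidden node j (rows = input index,
    // columns = hidden index), as the comment above proposes.
    const W = [
      [0.1, 0.2, 0.3],
      [0.4, 0.5, 0.6],
    ];
    const x = [1.0, 0.5]; // two inputs

    // Convention A: treat x as a 1x2 row vector and compute x * W -> 1x3 row.
    const hiddenA = W[0].map((_, j) =>
      x.reduce((sum, xi, i) => sum + xi * W[i][j], 0)
    );

    // Convention B: store the transpose (rows = hidden index) and compute
    // W_T * x -> 3x1 column. Same numbers, just a different shape.
    const W_T = W[0].map((_, j) => W.map(row => row[j]));
    const hiddenB = W_T.map(row =>
      row.reduce((sum, wij, i) => sum + wij * x[i], 0)
    );

    console.log(hiddenA, hiddenB); // both are [0.3, 0.45, 0.6]
    ```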

  • @adeelfitness4993
    @adeelfitness4993 4 years ago

    Honestly speaking, you are the best teacher I have ever had. Love from Pakistan. Keep this great stuff going 👍

  • @miteshsharma3106
    @miteshsharma3106 7 years ago +1

    Dan I hope you get well soon ....... waiting for your return on the coding train!!!

  • @DustinTWilliams
    @DustinTWilliams 7 years ago +1

    Great video! I hope you spring back quick and well from your incident, my good sir!

  • @shrikanthsingh8243
    @shrikanthsingh8243 5 years ago +1

    Very descriptive explanation. Thank you for making my lectures easier.

  • @devinvenable4587
    @devinvenable4587 7 years ago

    I really enjoy your teaching style. Quirky, funny and informative.

  • @EscapeMinecraft
    @EscapeMinecraft 7 years ago

    Great explanation of neural networks. Based on your latest code and Tariq Rashid's code, I've written my own neural network library in C++, but with customizable multiple layers. I mean, it's not great, but it's working. Can't wait to see how the series continues.

  • @VijayChethanSFCD
    @VijayChethanSFCD 4 years ago

    Well explained. After watching many videos, I found this is the best one; it gave me a clear idea about NNs.

  • @anuragdixit7107
    @anuragdixit7107 6 years ago +3

    Hey sir, you are the most wonderful teacher I've ever had.
    Just love your classes.

  • @DaleIsWigging
    @DaleIsWigging 7 years ago +7

    Are you going to eventually move on to evolving the overall topology (how the graph is connected), for example the NEAT algorithm?
    (Note: you used * for scalar multiplication and x for matrix and vector notation. If you are planning to cover the cross product, you should probably avoid using x for multidimensional products that are associative... uh, I mean use it only for the cross product.)

  • @omicron296
    @omicron296 1 year ago

    You are fantastic! I'm learning from your explanations! Thank you!!!!!

  • @hr4735
    @hr4735 3 years ago

    Love the energy and excitement, happy to have discovered you!

  • @zumamusic5246
    @zumamusic5246 3 years ago

    Following along and making a library for Java! Nice video!

  • @osmanmustafaquddusi318
    @osmanmustafaquddusi318 7 years ago

    Sorry to hear about your arm. I will pray to God for your fast recovery. Get well soon ☺☺☺ brother.

  • @HeduAI
    @HeduAI 5 years ago +2

    Awesome explanation! This is how 'teaching' is done! :)

  • @cristianrgreco
    @cristianrgreco 7 years ago +1

    Thank you for these videos. Genuinely interesting and very well presented.

  • @marianagonzales3201
    @marianagonzales3201 4 years ago +1

    Hi! I loved your explanation! It's really clear and easy to understand, great teacher, thank you very much ☺️

  • @mademensociety5210
    @mademensociety5210 4 years ago +1

    This video is still good learning material today!! Thank you .

  • @lutzruhmann7162
    @lutzruhmann7162 5 years ago +1

    Hello, I just would like to say Thank You! After your videos I get it.

  • @pantepember
    @pantepember 4 years ago

    Also thank you for your clear pronunciation.

  • @surobhilahiri
    @surobhilahiri 5 years ago +1

    Thanks so much for explaining the concept so well!

  • @mightyleguan1451
    @mightyleguan1451 7 years ago

    Hi Daniel, get well soon and keep up the amazing work!

  • @MrVildy
    @MrVildy 7 years ago

    Hey Dan, first of all I love your content and your personal way of engaging with your viewers! You have a very natural and captivating presence. I would like to ask you and the rest of the audience a question. I'm hopefully starting a software development program at a university in August, and I have been wondering: could there be any interest in the programming community in following a fresh programmer? I have been thinking about starting a blog or a YouTube channel where I would talk about the things we learn and some of the challenges.
    Is it interesting to learn alongside a new student, or should I wait until I get more experienced?
    If it is in any way interesting, how can I prepare to start?

  • @CocoChanel1313
    @CocoChanel1313 6 years ago

    Thank you, I've just started to study machine translation. Your videos help me understand it more deeply. I haven't seen the next videos yet, but I hope you will also make one in which you use Python 3 and explain more of the maths, like the sigmoid function, softmax, ... :).

  • @youssefnim
    @youssefnim 1 year ago

    great video, very clearly explained, thank you !

  • @ParitoshBaronVLOGS
    @ParitoshBaronVLOGS 5 years ago +4

    Fall in Love With ML ❤️

  • @climito
    @climito 1 year ago

    This man is gold

  • @gajuahmed4426
    @gajuahmed4426 3 years ago

    Great explanation. What about the bias that should be added along with the weights and inputs when you do forward propagation? You didn't mention it here.
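
    For readers wondering where the bias fits: it is typically added to the weighted sum before the activation function. A rough, self-contained JavaScript sketch with invented numbers and function names (not the series' actual Matrix/NeuralNetwork classes):

    ```javascript
    // One feed-forward layer: hidden_j = sigmoid( sum_i(w_ji * x_i) + bias_j ).
    function sigmoid(z) {
      return 1 / (1 + Math.exp(-z));
    }

    function forwardLayer(inputs, weights, bias) {
      // weights: one row per hidden node, one column per input
      return weights.map((row, j) =>
        sigmoid(row.reduce((sum, w, i) => sum + w * inputs[i], 0) + bias[j])
      );
    }

    const inputs = [1, 0];
    const weightsIH = [[0.5, -0.3], [0.8, 0.2]]; // 2 hidden nodes x 2 inputs
    const biasH = [0.1, -0.1];                   // one bias per hidden node
    console.log(forwardLayer(inputs, weightsIH, biasH)); // hidden activations
    ```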

  • @alirezanet
    @alirezanet 5 years ago +1

    this video is awesome thanks... for the first time I completely understand why we need linear algebra in neural networks xD

  • @thevfxwizard7758
    @thevfxwizard7758 6 years ago +5

    THANK YOU.

  • @suvranjansanyal2717
    @suvranjansanyal2717 7 years ago

    EXCELLENT.. YOU ARE AN AWESOME PROFESSOR.. KEEP UPLOADING YOUR VIDEOS.

  • @georgechristoforou991
    @georgechristoforou991 5 years ago

    You talk about the normalisation of inputs that are much larger than other inputs. Couldn't this just be handled by adjusting the weights to normalise the very large input?
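
    In principle a tiny weight can absorb a huge raw input, but starting from random weights a feature in the thousands tends to saturate the activation and slow learning, so it is usually easier to rescale the inputs first. A small illustrative JavaScript sketch of min-max scaling to the 0..1 range (not code from the video; the numbers are made up):

    ```javascript
    // Scale each value of a feature into the 0..1 range before feeding it in.
    function minMaxScale(values) {
      const min = Math.min(...values);
      const max = Math.max(...values);
      return values.map(v => (v - min) / (max - min || 1)); // guard divide-by-zero
    }

    const houseAreas = [45, 120, 300, 85];  // e.g. square meters
    console.log(minMaxScale(houseAreas));   // [0, 0.294..., 1, 0.156...]
    ```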

  • @taihatranduc8613
    @taihatranduc8613 4 years ago

    I love you so much. You are so likable, so funny. You are awesome.

  • @MorneBooysen
    @MorneBooysen 6 years ago

    How would you manage "degrees of freedom", or check whether a feasible solution even exists, before you dump data into a neural net that will try to fit the data regardless? Great vids!

  • @majdoubwided6666
    @majdoubwided6666 4 years ago

    You are the best, really. You make it easy, and I like MLPs just because of you!

  • @MarcelRuland
    @MarcelRuland 7 years ago

    19:44 I got goosebumps when I heard Dan say 'Python'.

  • @luqmanahmad3153
    @luqmanahmad3153 5 years ago +1

    Very amazing stuff and explanation. I want to ask: will you do a playlist on convolutional NNs? It would help a lot.

  • @Cnys100
    @Cnys100 3 years ago

    Thank you !
    This is really helping me!

  • @ajiththiyar7609
    @ajiththiyar7609 6 years ago

    Man you just help a lot of people......Thanks😁

  • @Gentleman217
    @Gentleman217 4 years ago

    OMG. I thought I would never understand it. But it was all about the teacher. Neural networks are so easy. I think I will retake the BBM406 Fund. of Mach. Learning lecture, and I'll note down how much my grade improves.
    When I first saw the 'AND/OR truth table' I thought, 'Really? I'm not gonna watch that', but after I watched, I got the concept. It really helped me, thanks a lot.. :P

  • @smrazaabidi1495
    @smrazaabidi1495 7 years ago

    Excellent!! Keep it up, but could you please tell me how we know how many hidden neurons to choose for the hidden layers? I mean, is there any formula?

  • @Lionel_Ding
    @Lionel_Ding 5 years ago +1

    You are just amazing ! When you explain it, everything seems simple ^^'

  • @raimetm
    @raimetm 3 years ago

    Daniel, you are an amazing guy and do an amazing job! Thank you so much!

  • @dinoswarleafs
    @dinoswarleafs 7 years ago +9

    Hope your arm gets better soon :)

    • @mightyleguan1451
      @mightyleguan1451 7 years ago +4

      Hi man, nice to see you here! I enjoy your CS:GO videos just as much as Dan's videos

  • @MJ-in9wt
    @MJ-in9wt 7 years ago

    Hey, does p5 support ECMAScript 6? Specifically classes and inheritance, because those are easier to teach junior students than objects in JSON-style notation.

  • @viktorstrate
    @viktorstrate 7 years ago

    11:46 You say you would need to normalize the inputs. Is this really needed, since all the inputs are weighted? If the weight for house area is a really small number, let's say 0.00001, it would automatically get normalized, right?

  • @armandasbarkauskas4485
    @armandasbarkauskas4485 5 years ago +1

    Best tutorials on YouTube!!! Keep going ;p

  • @ilaapattu9845
    @ilaapattu9845 7 years ago

    This Video Really Has Helped Me!
    Thumbs Up!

  • @patrik8641
    @patrik8641 3 years ago

    Isn't the house prediction linearly separable though?

  • @xrayer4412
    @xrayer4412 5 years ago

    What programming environment are you using? Why aren't you using Processing anymore in the series?

  • @grapheggspecies8907
    @grapheggspecies8907 5 years ago

    Hi Daniel Shiffman! I have loads of questions from this video and several others of yours I've watched... I am 14, and we were taught all of this linear algebra last year in school, so I have no problem with that. I want to know why we use perceptrons and neural networks when we can do the same thing with linear and polynomial regression (I've tried it; it took very little time and processing power, and it also had a lower loss). Also, can we optimize the weights in a neural net using the genetic algorithms you described in a video series you have already made?
    Please, whoever can, answer these questions... I have no idea who to ask and I don't trust Stack Exchange (somehow they aren't allowing me to post anything there). Thank you for making this video, it really helped me, as did all the others I've watched. They have pulled me out of video game addiction... (I think)
    *PS (for everyone who has had trouble with matrices): these videos on neural nets by you and fastai really help me ace my math tests...

    • @TheCodingTrain
      @TheCodingTrain 5 years ago

      I like to use simple scenarios that don't need neural networks to practice / learn about neural networks. So that's really the main reason here!

    • @grapheggspecies8907
      @grapheggspecies8907 5 years ago

      @@TheCodingTrain Hi again! I understood that from the video, but what I want to know is: in what kind of situation would a neural network work better than polynomial regression (because in my eyes they seem the same)? I hope I'm not bothering you too much...
      Also, I am not making any reference to the video's content.
      Cheers, -my_youtube_username!

  • @ahmedel-asasey1982
    @ahmedel-asasey1982 7 years ago +1

    VERY GOOD
    KEEP GOING

  • @inbasasis8221
    @inbasasis8221 6 years ago

    great video sir

  • @peter_dockit
    @peter_dockit 1 year ago

    The best!

  • @StandardName562
    @StandardName562 4 years ago +1

    Thank you so much!

  • @amc8437
    @amc8437 3 years ago

    Can you also create a perceptron using Python and PyTorch?

  • @davederdudigedude
    @davederdudigedude 7 years ago

    Hello all, what I don't get is: what's the difference between a feed-forward neural network trained with backpropagation and an MLP? Is there someone here who can explain the difference?

  • @OmkaarMuley
    @OmkaarMuley 7 years ago

    thanks for making this video!

  • @tw7522
    @tw7522 7 years ago

    I'm not sure if you've ever introduced yourself? Anyway, keep up the good work. Thank you!

    • @fvcalderan
      @fvcalderan 7 years ago

      He introduces himself at the beginning of every livestream.

  • @anuragghosh1139
    @anuragghosh1139 6 years ago

    You are awesome so far!

  • @ThomasLe
    @ThomasLe 7 years ago

    Is this a repost? I swear I saw this exact same thing a few days ago when I binged the series. I even called out that you made a mistake when connecting the lines from the hidden layer to the output layer.
    How are there comments from 3 days ago but it says published June 30 (TODAY) ?
    Was there a glitch in The Matrix?! :P
    The reason I am back to this video is because it was in my notifications as a new video in the series, but it, obviously, is not.

    • @TheCodingTrain
      @TheCodingTrain 7 years ago +2

      It's because you saw this video while it was still unlisted.
      We make all the videos available as soon as we can to anybody who wants to watch the whole series, but we publish them roughly once a day to keep a steady flow of videos coming out. -MB

    • @ThomasLe
      @ThomasLe 7 years ago

      Okay, that makes sense! Thanks!

  • @xxsunmanxx5430
    @xxsunmanxx5430 6 years ago

    Hey, I have a question: can you do this in Processing?

  • @marcolinyi
    @marcolinyi 7 years ago

    If I want to program games, which language do I need to study and which program do I need to download? PS: I'm an Italian boy, so my English is bad.

  • @modernmirza5303
    @modernmirza5303 4 years ago

    One genius guy.^

  • @sololife9403
    @sololife9403 3 years ago

    THEEEE best!

  • @michaellewis8211
    @michaellewis8211 6 years ago

    Can you tell us the process you use to make a project?

  • @wawied7881
    @wawied7881 7 years ago +1

    Keep it up.

  • @Jabh88073
    @Jabh88073 5 years ago +1

    Your videos help a lot.
    Thank you!

  • @siddhantkumar7257
    @siddhantkumar7257 6 years ago

    What are you taking................. Humphry Davy's N2O (nitrous oxide), the laughing gas?

  • @sukanyabasu7090
    @sukanyabasu7090 5 years ago

    It was helpful

  • @georgechristoforou991
    @georgechristoforou991 5 years ago +1

    You should have gone for a football formation such as 4-4-2 or a 3-5-2

  • @corrompido7680
    @corrompido7680 7 years ago

    Is your Matrix powered by human beings?

  • @ranjithkumarpranjithkumar
    @ranjithkumarpranjithkumar 7 years ago

    Is a bias not required for the hidden layers? What if all the values we get are zero?

    • @ItsGlizda
      @ItsGlizda 7 years ago +2

      Typically, a single bias node is added for the input layer and every hidden layer in a feedforward network. You are right, it helps with dealing with zero inputs.

  • @abhishekjadav2371
    @abhishekjadav2371 7 years ago

    When is the next live session for NN?? (I think it is today but I'm not sure.)

    • @marufhasan9365
      @marufhasan9365 7 years ago

      I was wondering the same thing. Can anybody clarify the actual schedule ?

  • @algeria7527
    @algeria7527 7 years ago

    Wow, you are amazing.

  • @mohan_manju
    @mohan_manju 7 years ago

    Hey Daniel, how do I modify this when multiple hidden layers are required?
    It would be useful if you taught that. Thank you :)

    • @datdemlomikstik4443
      @datdemlomikstik4443 7 years ago

      I have made a version with multiple hidden layers
      github.com/lomikstik/javascript-p5/blob/master/neuro-evolution-shiffman%2Blib/flappy_for_multi_layer_nn/libraries/nn_multi.js

    • @grapheggspecies8907
      @grapheggspecies8907 5 years ago

      yeah! how to modify weights in a neural net...maybe he explains that later on
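
      For anyone looking for the general shape of the answer to the multiple-hidden-layers question above: keep one weight matrix and one bias vector per layer and loop over them in the feed-forward pass. A hedged sketch of the idea in plain JavaScript (illustrative only; not the linked library or the series' code, and the layer sizes are made up):

      ```javascript
      // Feed-forward through any number of layers, from first hidden to output.
      const sigmoid = z => 1 / (1 + Math.exp(-z));

      function feedForward(input, layers) {
        // layers: [{ weights: [[...]], bias: [...] }, ...]
        let activation = input;
        for (const { weights, bias } of layers) {
          activation = weights.map((row, j) =>
            sigmoid(row.reduce((sum, w, i) => sum + w * activation[i], 0) + bias[j])
          );
        }
        return activation;
      }

      const layers = [
        { weights: [[0.2, 0.8], [-0.5, 0.1], [0.4, 0.4]], bias: [0, 0, 0] }, // 2 -> 3
        { weights: [[0.3, -0.2, 0.7]], bias: [0.1] },                        // 3 -> 1
      ];
      console.log(feedForward([1, 0], layers)); // single output value in 0..1
      ```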

  • @camjansen5025
    @camjansen5025 4 years ago

    Why not make this course with Python? (Python for AI)!!
    Nice work.

  • @equarbeyra5501
    @equarbeyra5501 4 years ago

    Hello dear, how can I get this software, Neusciences Neuframe v4?

  • @chancenigel0186
    @chancenigel0186 5 years ago

    Why can't you do this stuff with Java??? Please let me know, because I'm trying to do NN coding in Java.

    • @TheCodingTrain
      @TheCodingTrain 5 years ago

      You can find a Java port of this series here!
      github.com/CodingTrain/Toy-Neural-Network-JS/blob/master/README.md#libraries-built-by-the-community

  • @AHuMycK
    @AHuMycK 6 years ago

    Good video,
    but you made a mistake at 18:22.
    It has to be: h1 = w11*x1 + w12*x2
    h2 = w21*x1 + w22*x2
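
    For reference, the correction in this comment is just the standard row-by-column matrix product, reading w_ij as "to hidden node i from input j" (a worked equation added here for clarity; the video uses its own labeling on the whiteboard):

    ```latex
    \begin{bmatrix} h_1 \\ h_2 \end{bmatrix}
    =
    \begin{bmatrix} w_{11} & w_{12} \\ w_{21} & w_{22} \end{bmatrix}
    \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}
    =
    \begin{bmatrix} w_{11} x_1 + w_{12} x_2 \\ w_{21} x_1 + w_{22} x_2 \end{bmatrix}
    ```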

  • @srikanthshankar8871
    @srikanthshankar8871 7 years ago

    Collab with Siraj Raval! Waiting to see you both in a video, that would be a blast!

    • @0xTim
      @0xTim 7 years ago

      ruclips.net/video/ad7SUyN5V7A/видео.html

  • @nkosanamabuza109
    @nkosanamabuza109 3 years ago

    I wish you could do it in Python.

  • @pyclassy
    @pyclassy 4 years ago

    Can't you write the code in Python? It would be more helpful if you could do so.

  • @demetriusdemarcusbartholom8063
    @demetriusdemarcusbartholom8063 2 years ago

    ECE 449

  • @Ikpoppy
    @Ikpoppy 6 years ago

    0:16:20 you are at 10.5

  • @arezhomayounnejad8399
    @arezhomayounnejad8399 3 years ago

    Hi there, how's it going? Thank you for the energetic explanation. Can I have your email for further discussion?

  • @waleedazam6916
    @waleedazam6916 6 years ago

    Very good, but one thing I must say: the videos are too long-winded. They should be more summarized, so the listener can get more in less time.

  • @hjjol9361
    @hjjol9361 6 years ago

    why not stop now ?

  • @seanliu5582
    @seanliu5582 7 years ago

    Nice video! First 750 views :D

  • @kristianwichmann9996
    @kristianwichmann9996 7 years ago

    w_{12} and w_{21} should be switched. Edit: Ah, it was caught :-)