Essential Matrix Algebra for Neural Networks, Clearly Explained!!!

  • Published: 10 Dec 2024

Comments • 163

  • @statquest
    @statquest  1 year ago +9

    To learn more about Lightning: lightning.ai/
    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

    • @bigbacktor
      @bigbacktor 11 months ago

      Hi, I would like to buy the book in color. Does anyone know if that is possible? It seems to me that on Amazon it is in black and white and on Lulu it is in color. Is that right? I am from Spain.

    • @statquest
      @statquest  11 months ago

      @@bigbacktor They are all in color. And there is a version that is translated into Spanish if you are interested.

    • @bigbacktor
      @bigbacktor 11 months ago +1

      @@statquest Thanks a lot!! BAM you just sold another book :)

    • @statquest
      @statquest  11 months ago +1

      @@bigbacktor Hooray!!! Thank you very much for supporting StatQuest! BAM! :)

  • @kmart7117
    @kmart7117 2 months ago +8

    I did a linear-algebra-heavy undergrad and master's degree. Nowhere in that experience did I get such a clear and succinct description of the transformation matrix and the reasoning behind why our order of operations for matrix multiplication is so idiosyncratic. Josh is a really amazing educator.

    • @statquest
      @statquest  2 months ago

      Thank you very much! :)

  • @백영래-u3x
    @백영래-u3x 1 year ago +19

    You have been staying the course for a long, long time. It's not so easy! Keep up the good work!

  • @bin4ry_d3struct0r
    @bin4ry_d3struct0r 1 year ago +28

    All visual learners are blessed by the great Josh Starmer!

  • @chefmategpt-v3u
    @chefmategpt-v3u 1 year ago +16

    What you do is nothing short of a miracle! Immense gratitude

  • @ram-my6fl
    @ram-my6fl 4 months ago +2

    They say you need linear algebra to learn machine learning.
    But in reality, all you need is this video and some 2.3 extra concepts for the whole of machine learning.
    Thank you StatGOD

    • @statquest
      @statquest  4 months ago +1

      I, too, am confused about why so much emphasis is put on learning linear algebra as a prerequisite for ML. I hardly ever use anything more complicated than matrix multiplication and transposing matrices.
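
      (For concreteness, those two operations look like this in a minimal NumPy sketch; the shapes and numbers below are made up for illustration and are not from the video:)

        import numpy as np

        # a batch of 2 inputs, each with 3 features (made-up numbers)
        X = np.array([[1.0, 2.0, 3.0],
                      [4.0, 5.0, 6.0]])

        # a weight matrix with one row per output unit: shape (4, 3)
        W = np.array([[0.1, 0.2, 0.3],
                      [0.4, 0.5, 0.6],
                      [0.7, 0.8, 0.9],
                      [1.0, 1.1, 1.2]])

        # matrix multiplication plus a transpose: (2 x 3) @ (3 x 4) -> (2 x 4)
        Y = X @ W.T
        print(Y.shape)  # (2, 4)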

  • @davidlu1003
    @davidlu1003 28 days ago +2

    Some people do not understand why I enjoy your courses. It's because you simply explain everything clearly, starting from the original theory.😁😁😁

  • @kitkitmessi
    @kitkitmessi 8 months ago +3

    This is freaking amazing! Would love to see more math lessons like this!

  • @yurobert3007
    @yurobert3007 5 months ago +1

    Crisp and clear explanation of matrix multiplication in the context of neural networks. Moreover, the quality of the graphics/visual presentation is impeccable!

  • @Why_I_am_a_theist
    @Why_I_am_a_theist 1 year ago +7

    The world is a better place with Josh🎉

  • @free_thinker4958
    @free_thinker4958 11 months ago +7

    Could you please make a video about QLORA? ❤ You're our savior when it comes to understanding complex concepts. Thank you, man!

    • @statquest
      @statquest  11 months ago +1

      I'll keep that in mind.

    • @TheTruthOfAI
      @TheTruthOfAI 10 months ago

      That would be lovely. Perhaps LORA itself holds strong glue potential across neural networks; looking forward to such an amazing video.

  • @arenashawn772
    @arenashawn772 11 months ago +4

    I found this channel when searching for a clear explanation of the central limit theorem on Google (after doing some simulation in R using sample sizes much less than 30 and being intrigued by the results I got) and I just want to say I love the content so much! (And the ukulele episode ❤) I’ve recently started some machine learning classes on Coursera and edX, and I must say the explanations you have here in these episodes are SO MUCH BETTER AND MORE TO THE POINT/BETTER DEFINED than the multi-thousand-dollar classes (I’m surely glad I chose to audit them first!) taught by professors from Harvard or engineers working for Google/IBM. So much better!… ❤❤❤
    Just want to say thank you and Merry Christmas! I know I will be going through these videos one by one in the coming months…

    • @statquest
      @statquest  11 months ago +1

      Thank you very much!!! I'm so happy you enjoy my videos. BAM! :)

    • @arenashawn772
      @arenashawn772 11 months ago +1

      @@statquest I really did and binge-watched a bunch… But I must say I now enjoy your songs even more 😂 Just bought all your albums on Bandcamp - they are awesome! That Going Back to Cali song just had me rolling off my chair at the end of it… I relocated from the San Francisco Bay Area to the Florida panhandle not long ago, so that song really struck a chord with me 😂😂😂

    • @statquest
      @statquest  11 months ago

      @@arenashawn772 Thank you very much! I'm glad you enjoy the tunes and the videos. I hope the move went well! :)

  • @abhinavchauhan1346
    @abhinavchauhan1346 1 month ago +1

    "Those memories follow me around!!!"
    Nailed it, Josh!😂😂

  • @ageofkz
    @ageofkz 9 months ago +2

    Really amazing work! This set of videos (the neural network playlist) has really helped me in my uni coursework and project! My groupmates and I are planning to get a StatQuest triple bam hoodie each haha!

    • @statquest
      @statquest  9 months ago

      That's awesome!!! TRIPLE BAM! :)

  • @ritshpatidar
    @ritshpatidar 1 year ago +7

    I was thinking about taking a course to learn matrix algebra yesterday. Thanks for posting this video. It is really helpful, and it is like a wish come true.

  • @varshinibalaji3535
    @varshinibalaji3535 11 months ago +2

    Wow, just the perfect video I was looking for! Loved all the Taylor references and music puns.

    • @statquest
      @statquest  11 months ago

      Hooray!!! You're the first person to mention the Taylor references in a comment. BAM!!! :)

    • @varshinibalaji3535
      @varshinibalaji3535 11 months ago +1

      @@statquest I was looking for it, since you mentioned something with Taylor was coming soon in one of my LinkedIn posts :) Plus, I've been reading a lot of academic papers lately, so I needed better context on matrix transformations to interpret the math better! So, Double BAM, indeed!

  • @TheTruthOfAI
    @TheTruthOfAI 10 months ago +2

    I love your videos; they helped me so much... I learned a lot... I was able to make UNA thanks to your lessons :)

    • @statquest
      @statquest  10 months ago +1

      Triple bam! Congratulations!

  • @haitematik5832
    @haitematik5832 10 months ago +2

    Man, you deserve a thousand times more subscribers.

  • @ShadArfMohammed
    @ShadArfMohammed 1 year ago +4

    Baaam, this is good :D I have been waiting for this; to be honest, I had a feeling that one day you would make such a tutorial. Your content is great.

  • @michaelzap8528
    @michaelzap8528 6 months ago +1

    Best of the best. I'm really speechless now.

  • @anmolarora2599
    @anmolarora2599 9 months ago +2

    Thank you for explaining it so simply that even a novice like me can understand it.

  • @randr10
    @randr10 1 year ago +3

    Thank you for this video. I think I understand what a transformer is now.

  • @ilirhajrullahu4083
    @ilirhajrullahu4083 1 year ago +6

    Very nice video! Thank you for uploading such helpful material :). It would be great if you made a video on vector and matrix calculus. These are important topics in NNs too :).

    • @statquest
      @statquest  1 year ago +1

      I'll keep that in mind.

    • @gocomputing8529
      @gocomputing8529 1 year ago +1

      Thanks for the great video! Also, the topic proposed here would be super interesting, so I hope you can do it someday.

  • @AyumiFortunex
    @AyumiFortunex 10 months ago +2

    Absolutely fantastic explanation again

  • @Ivoshevo
    @Ivoshevo 2 months ago +1

    Thank you so much, Mr. StatQuest, it was a big BAM! for me

  • @MohsenEedloo
    @MohsenEedloo 11 months ago +4

    Hi Josh... would you please make a video explaining the differences between different statistical tests like t, z, chi... I want to know the differences and when to use each.

    • @statquest
      @statquest  11 months ago

      I'll keep that in mind.

  • @BrianPondiGeoGeek
    @BrianPondiGeoGeek 1 year ago +3

    Triple Bam for sure. Amazing explanation.

  • @techproductowner
    @techproductowner 9 months ago +2

    You will be known and remembered for the next 1000 years...

  • @kartikchaturvedi7868
    @kartikchaturvedi7868 1 year ago +2

    Superrrb Awesome Fantastic video

  • @siddhanthbhattacharyya4206
    @siddhanthbhattacharyya4206 8 months ago +2

    Quadruple bam! (One bam for me finally understanding)

  • @louisnemzer6801
    @louisnemzer6801 1 year ago +6

    Squatch: So it's all just matrix multiplication?
    Josh: Always has been

  • @kujohjotaro3017
    @kujohjotaro3017 11 months ago +2

    Your video is just a lifesaver for me and my essay! Could you make a video on the GloVe model in NLP?

    • @statquest
      @statquest  11 months ago

      I'll keep that in mind.

  • @beyondl6914
    @beyondl6914 4 months ago +1

    'Squatch is too relatable🙏😭💯

  • @NJCLM
    @NJCLM 1 year ago +4

    Very good video! You should remake one of the transformer videos with the matrix notation, as you did at the end of this video.

    • @statquest
      @statquest  1 year ago +7

      I'm working on it right now. Hopefully it will be ready soon.

    • @NJCLM
      @NJCLM 1 year ago

      @@statquest Take your time, and thank you very much; your content is so valuable!

  • @amjadiqbal478
    @amjadiqbal478 8 months ago +2

    Quite good. ❤

  • @pran441
    @pran441 11 months ago +2

    Joshua, your teaching was fantastic, but I couldn't quite grasp the concept.

    • @statquest
      @statquest  11 months ago

      What time point (minutes and seconds) was confusing?

  • @slash_29
    @slash_29 1 year ago +3

    Please make a part 2 with more details and new terms.

  • @SoukainaGhafel
    @SoukainaGhafel 8 months ago +2

    Thanks for your effort, your videos helped me so much, but could you please tell us how lghm works?

    • @statquest
      @statquest  8 months ago

      Do you mean Light Gradient Boost? LightGBM?

    • @SoukainaGhafel
      @SoukainaGhafel 8 months ago +1

      I mean LightGBM
      @@statquest

  • @deveshbhatt4063
    @deveshbhatt4063 1 year ago +4

    Triple Bam🎉❤

  • @aga5979
    @aga5979 6 months ago +1

    Could you do a series on the "Attention Is All You Need" paper? Thank you, sir.

    • @statquest
      @statquest  6 months ago +1

      This video walks you through the concepts in that paper: ruclips.net/video/zxQyTK8quyY/видео.html
      And this video goes through the math: ruclips.net/video/KphmOJnLAdI/видео.html

    • @aga5979
      @aga5979 6 months ago +1

      @@statquest thank you so much!!

  • @ps3301
    @ps3301 1 year ago +3

    Could you explain the math behind a basic liquid neuron and show how it differs from other neurons?

  • @PavanKumar-pt2sh
    @PavanKumar-pt2sh 1 year ago +3

    Can you please create a video on multi-modal transformer architecture?

    • @statquest
      @statquest  1 year ago +1

      I'll keep that in mind.

    • @free_thinker4958
      @free_thinker4958 11 months ago +1

      I hope he does it; he's our savior when it comes to understanding complex concepts.

  • @magtazeum4071
    @magtazeum4071 10 months ago +1

    Hi Josh, could you do a video on time series clustering and time series analysis, please?

    • @statquest
      @statquest  10 months ago

      I'll keep that in mind.

  • @Yutubnyajus
    @Yutubnyajus 10 months ago +1

    Can you please discuss stochastic gradient boosting for classification? I'm having trouble understanding that 😢

    • @statquest
      @statquest  10 months ago

      I have a whole series of videos on Gradient Boosting. You can find them here: statquest.org/video-index/

  • @shouvikdey7078
    @shouvikdey7078 11 months ago +1

    Do more videos related to GANs, etc.

    • @statquest
      @statquest  11 months ago

      I'll keep that in mind.

  • @Ahmed_Issaoui
    @Ahmed_Issaoui 11 months ago +2

    Hello StatQuest, what software do you use to create your videos?
    (Your answer is really useful to me.)

    • @statquest
      @statquest  11 months ago

      I give away all of my secrets in this video: ruclips.net/video/crLXJG-EAhk/видео.html

  • @anonymousgirl1463
    @anonymousgirl1463 11 months ago +1

    Hey Josh! I love your channel and I was thinking about buying a study guide. What is the difference between watching one of your playlists and buying a study guide? Do you cover exactly the same material in both, and is buying the study guide just for support/like a donation, or is there any difference?

    • @statquest
      @statquest  11 months ago

      They are the same. The difference is that some people like to have the study guides for offline use or for adding their own notes to. In some ways, the study guides are like "cheat sheets" - everything in a video is condensed to about 3 to 5 pages.

  • @nivcohen961
    @nivcohen961 9 months ago +1

    You are awesome

  • @akankshaaggarwal394
    @akankshaaggarwal394 2 months ago +1

    I'm 'Squatch. 'Squatch is happy!

  • @midhileshmomidi3120
    @midhileshmomidi3120 11 months ago +1

    Can we get a book on these concepts as well?

    • @statquest
      @statquest  11 months ago

      I'm writing it right now.

  • @artofwrick
    @artofwrick 6 months ago

    Should we move to Kaggle as we learn from these videos? Like solving their exercises to get a feel for real-world machine learning in Python, up to the competitions they host? Or is there anything in between?

    • @statquest
      @statquest  6 months ago

      You can definitely take a stab at a Kaggle dataset and see how things go. However, I also have some videos that walk you through how to do real-world data analyses here: ruclips.net/video/GrJP9FLV3FE/видео.html and here ruclips.net/video/8A7L0GsBiLQ/видео.html

  • @trinitywarlord
    @trinitywarlord 5 months ago +1

    Josh Starmer is a Swiftie!

  • @vigneshvicky6720
    @vigneshvicky6720 1 year ago +11

    We want a YOLO series, mainly YOLOv8 from scratch.

    • @statquest
      @statquest  1 year ago +2

      I'll keep that in mind.

    • @anickkhan
      @anickkhan 11 months ago +2

      Please, Professor, it's an earnest request. Lots of love from Bangladesh ❤❤

  • @davidmurphy563
    @davidmurphy563 1 year ago +3

    OK, it always annoyed me that when you're doing matrix-vector (column) multiplication, they always write the matrix first, then the vector. It never occurred to me until you said so just now that the columns and rows don't line up as a valid tensor operation if you write them the other way round... Doh! It doesn't look nice though.
    Btw, why did you use a row vector and a transposed matrix? I would always use a column vector. Column-space transforms are the default for me, and you can picture the latent space.
    The only time I'd use rows is if I have a system of linear equations.

    • @statquest
      @statquest  1 year ago +1

      I agree that the matrix * column looks bad. And I chose to do row * matrix because that is what they used in the PyTorch documentation.

    • @davidmurphy563
      @davidmurphy563 1 year ago +1

      @@statquest Glad it's not just me who thinks it looks backwards! :)) But you're of course right; 2x2 * 2x1 is a valid operation, whereas 2x1 * 2x2 is, strictly speaking, undefined.
      Oh, a tip you may (or may not!) find a useful teaching tool:
      I always look at matrix multiplication in terms of a series of dot product operations. Once the student understands that the dot product outputs a scalar expressing the likeness of two vectors (e.g. whether two normalised vectors point the same way), then rather than just mechanically running an algorithm, the student can see that it's plotting the vector in the new space by comparing its likeness to the space's basis vectors one axis at a time. That's why I think it's always handy to see a square matrix as a series of basis vectors.
      So, if you're going from an orthonormal basis to one where, say, y is mirrored - {{1, 0}, {0, -1}} - then it's quite apparent why taking the dot product for each spatial dimension will plot the vector upside-down. You could show an image flipping to drive the point home.
      I just think that's intuitive, and why we're multiplying and adding across columns and rows.
      At least that's how I like to see it.
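
      (A minimal NumPy sketch of the mirror-matrix example above; the vector's values are made up for illustration and are not from the video:)

        import numpy as np

        # the commenter's example matrix: keep x, mirror y
        M = np.array([[1.0,  0.0],
                      [0.0, -1.0]])

        v = np.array([2.0, 3.0])  # an arbitrary (made-up) vector

        # each component of M @ v is a dot product of v with one row of M,
        # i.e. a "likeness" comparison against one axis of the new space
        print(M @ v)                              # [ 2. -3.]  -> y is flipped
        print(np.dot(M[0], v), np.dot(M[1], v))   # 2.0 -3.0   -> the same two dot products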

  • @ivant_true
    @ivant_true 1 year ago +2

    Great

  • @jarsal_firahel
    @jarsal_firahel 8 months ago

    What about a video on the MAMBA architecture? That would be really BAAAM

    • @statquest
      @statquest  8 months ago +1

      I'll keep that in mind.

  • @Nono-de3zi
    @Nono-de3zi 1 year ago

    Thanks, Josh. But naughty, naughty, the stage is not just rotating, it is flipping. Which you can also encode in matrices, of course ;-)

    • @statquest
      @statquest  1 year ago

      I'm not sure I understand what you mean by flipping in addition to rotating, as stage left and stage right are maintained throughout each change.

    • @Nono-de3zi
      @Nono-de3zi 1 year ago

      @@statquest The drawing of the stage is asymmetrical (one edge is slightly erased). When you did the slides, you flipped it instead of rotating it. As a result, StatSquatch is sometimes on one side, sometimes on the other. I know it was not on purpose 🙂 Thanks for the excellent vid as usual.

    • @statquest
      @statquest  1 year ago

      @@Nono-de3zi I'm still confused, because StatSquatch is always on stage left.

    • @Nono-de3zi
      @Nono-de3zi 1 year ago

      But the *stage* is flipped :-)

    • @statquest
      @statquest  1 year ago

      @@Nono-de3zi If it were flipped, then wouldn't stage left stay on top and stage right stay on the bottom?

  • @nutzeeer
    @nutzeeer 1 year ago

    14:05 Matrix multiplication can't be rearranged, as matrix multiplication is a sequence of calculations. Is this indicated by using X as a multiplication symbol and not •? Because in school we used • to indicate multiplication.

    • @nutzeeer
      @nutzeeer 1 year ago

      Ah no, the x is not signifying order. But I would like that to be visible from the writing alone, without the helpful explanation.

    • @nutzeeer
      @nutzeeer 1 year ago

      I wonder why matrices are turned sideways like that. It would feel easier for me to multiply rows with rows.

    • @statquest
      @statquest  1 year ago +1

      This is explained, although I'm guessing not to your satisfaction, at 10:58. It has to do with the ability to combine transformations. For more details, see: math.stackexchange.com/questions/271927/why-historically-do-we-multiply-matrices-as-we-do
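
      (A minimal NumPy sketch of both points in this thread, using made-up rotation and stretch matrices rather than the ones from the video: the order can't be swapped, but consecutive transformations can be combined into a single matrix.)

        import numpy as np

        A = np.array([[0.0, -1.0],
                      [1.0,  0.0]])   # rotate 90 degrees (made-up example)
        B = np.array([[2.0,  0.0],
                      [0.0,  1.0]])   # stretch along x

        v = np.array([1.0, 1.0])

        # the order can't be rearranged: rotate-then-stretch != stretch-then-rotate
        print(B @ (A @ v))   # [-2.  1.]
        print(A @ (B @ v))   # [-1.  2.]

        # but the two transformations can be collapsed into one matrix first,
        # because matrix multiplication is associative
        print((B @ A) @ v)   # [-2.  1.]  same as B @ (A @ v)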

  • @harryliu1005
    @harryliu1005 8 months ago +2

    After 10000000 years, scientists found a fossil record of StatQuest, and then he said "BAM!"

    • @statquest
      @statquest  8 months ago

      Ha! You made me laugh.

  • @yyyzzz-k3r
    @yyyzzz-k3r 1 year ago

    Great video, but I don't quite understand 25:25...

    • @statquest
      @statquest  1 year ago

      It just means that PyTorch stores the weights differently than we did in the earlier examples, and in order to get the same math, we have to transpose the PyTorch weight matrix.
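
      (For anyone who wants to see that transpose concretely, a minimal PyTorch sketch with made-up layer sizes and input; it is not the exact network from the video:)

        import torch
        import torch.nn as nn

        torch.manual_seed(0)
        linear = nn.Linear(in_features=2, out_features=3)  # weights stored as (3, 2)

        x = torch.tensor([[1.0, 2.0]])  # one input with 2 features

        # reproducing the layer's math by hand requires transposing the stored weights
        by_hand = x @ linear.weight.T + linear.bias
        print(torch.allclose(linear(x), by_hand))  # True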

  • @lifeisbeautifu1
    @lifeisbeautifu1 9 months ago +1

    BAM!

  • @bossgd100
    @bossgd100 1 year ago +1

    Woaw

  • @Jagentic
    @Jagentic 1 month ago

  • @mahdi.ahmadi.2
    @mahdi.ahmadi.2 1 month ago

    +1.6?! Is it correct?

    • @statquest
      @statquest  1 month ago

      What time point in the video, minutes and seconds, are you asking about?

    • @mahdi.ahmadi.2
      @mahdi.ahmadi.2 1 month ago

      @@statquest 21:34

    • @statquest
      @statquest  1 month ago +1

      @@mahdi.ahmadi.2 The video is correct. 1.6 is the bias value that we add to -1.0. This gives us a sum of 0.6, as seen at 21:45

    • @mahdi.ahmadi.2
      @mahdi.ahmadi.2 1 month ago

      @@statquest Actually, my main question is where the bias numbers +1.6 and 0.7 come from.

    • @statquest
      @statquest  1 month ago +1

      @@mahdi.ahmadi.2 The weights and biases for all neural networks are obtained with backpropagation. For details on how that works, see: ruclips.net/video/IN2XmBhILt4/видео.html
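
      (A minimal PyTorch sketch of that idea, with a made-up target and learning rate; it only shows how backpropagation nudges a single bias toward a useful value, not the actual training that produced the +1.6 and 0.7 in the video:)

        import torch

        weighted_sum = torch.tensor(-1.0)             # the -1.0 from the example above
        bias = torch.tensor(0.0, requires_grad=True)  # the bias starts at an arbitrary value

        for _ in range(100):
            output = weighted_sum + bias
            loss = (output - 1.0) ** 2      # squared error against a made-up target of 1.0
            loss.backward()                 # backpropagation: compute d(loss)/d(bias)
            with torch.no_grad():
                bias -= 0.1 * bias.grad     # gradient descent step
                bias.grad.zero_()

        print(round(bias.item(), 2))  # ~2.0, the bias that makes -1.0 + bias hit the target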

  • @davidlu1003
    @davidlu1003 2 months ago +1

    This is the easiest chapter for me.😆😆😆

  • @werewolfprogrammer
    @werewolfprogrammer 11 months ago

    Hi, I am trying to start a YouTube channel to make tutorial videos about data-science-related topics. I want to make the videos about things that are less popular but still important, since I found that it can be quite difficult to start off with these things because most of the information is in difficult-to-comprehend papers. My starting point will be social network analysis and natural language processing, as that is my main interest and expertise. However, I am interested in finding more topics, so I am starting by doing research on different channels that make tutorials for data science, AI, machine learning, statistics, natural language processing, graph theory or network analysis.
    So, for anybody in the comments who reads this message, could you help me out by replying with any YouTube creators who do something related to these topics, or any other digital platform like Brilliant? If you know a topic that is similar to the ones I mentioned, that would also be a great thing to share. Or if you know of better places to share this message. Or any other helpful tips.
    Thanks, everybody, for the help. If this message is regarded as spam, please say so and I will remove it.
    The topics again:
    -data science
    -AI
    -machine learning
    -statistics
    -natural language processing
    -graph theory
    -network analysis

  • @davidlu1003
    @davidlu1003 2 months ago

    Now I know a website where I can learn machine learning by myself.😁😁😁

  • @tsunningwah3471
    @tsunningwah3471 11 months ago

    zhina!

  • @samiotmani9092
    @samiotmani9092 9 months ago +3

    97 videos finished … small bam 🥲