To learn more about Lightning: lightning.ai/
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
Hi, I would like to buy the book in color. Does anyone know if that is possible? It seems to me that on Amazon it is in black and white and on Lulu it is in color. Is that right? I am from Spain.
@@bigbacktor They are all in color. And there is a version that is translated into Spanish, if you are interested.
@@statquest Thanks a lot!! BAM you just sold another book :)
@@bigbacktor Hooray!!! Thank you very much for supporting StatQuest! BAM! :)
I did a linear-algebra-heavy undergrad and master's degree. Nowhere in that experience did I get such a clear and succinct description of the transformation matrix and the reasoning behind why our order of operations for matrix multiplication is so idiosyncratic. Josh is a really amazing educator.
Thank you very much! :)
You have been staying the course for a long, long time. It's not so easy ! Keep up the good work!
Thank you!
All visual learners are blessed by the great Josh Starmer!
Thank you! :)
What you do is nothing short of a miracle! Immense gratitude
Thank you!
They say you need linear algebra to learn machine learning.
But in reality all you need is this video and some 2.3 extra concepts for the whole of machine learning
Thank you StatGOD
I, too, am confused why so much emphasis is put on learning linear algebra as a prerequisite for ML. I hardly ever use anything more complicated than matrix multiplication and transposing matrices.
Some people do not understand why I enjoy your courses, because you simply explain everything clearly from its original theory.😁😁😁
Thanks!
this is freaking amazing! Would love to see more math lessons like this
Thank you!
Crisp and clear explanation on matrix multiplication in the context of neural networks. Moreover, the quality of graphics/visual presentation is impeccable!
Thanks!
World is a better place with Josh🎉
Thanks!
Could you please make a video about QLORA? ❤ You're our savior when it comes to understanding complex concepts, thank you man
I'll keep that in mind.
that would be lovely, perhaps LORA itself holds a strong glue potential across neural networks, will be looking forward for such amazing video
I found this channel when searching Google for a clear explanation of the central limit theorem (after doing some simulations in R using sample sizes much less than 30 and being intrigued by the results I got), and I just want to say I love the content so much! (And the ukulele episode ❤) I’ve recently started some machine learning classes on Coursera and edX, and I must say the explanations you have here in these episodes are SO MUCH BETTER AND MORE TO THE POINT/BETTER DEFINED than the multi-thousand-dollar classes (I’m surely glad I chose to audit them first!) taught by professors from Harvard or engineers working for Google/IBM. So much better!… ❤❤❤
Just want to say thank you and Merry Christmas! I know I will be going through these videos one by one in the coming months…
Thank you very much!!! I'm so happy you enjoy my videos. BAM! :)
@@statquest I really did and binge-watched a bunch… But I must say I now enjoy your songs even more 😂 Just bought all your albums on Bandcamp - they are awesome! That Going Back to Cali song just had me rolling off my chair at the end of it… I relocated from the San Francisco Bay Area to the Florida panhandle not long ago, so that song really struck a chord with me 😂😂😂
@@arenashawn772 Thank you very much! I'm glad you enjoy the tunes and the videos. I hope the move went well! :)
"Those memories follow me around!!!"
Nailed it Josh!😂😂
Thanks!
Really amazing work! This set of videos (neural network playlist) has really helped me in my uni coursework and project! My groupmates and I are planning to get a statquest triple bam hoodie each haha!
That's awesome!!! TRIPLE BAM! :)
I was thinking about taking a course to learn matrix algebra yesterday. Thanks for posting this video. It is really helpful and it is like a wish came true.
BAM! :)
Wow, just the perfect video I was looking for! Loved all the Taylor references, and music puns.
Hooray!!! You're the first person to mention the Taylor references in a comment. BAM!!! :)
@@statquest I was looking for it, since you mentioned something with Taylor was coming soon, in one of my Linkedin Posts :) Plus, I've been reading a lot of academic papers lately, So needed a better context on matrix transformations to interpret the math better! So, Double BAM, indeed!
I love your videos; they helped me so much... I learned a lot... I was able to make UNA thanks to your teachings :)
Triple bam! Congratulations!
Man, you deserve a thousand times more subscribers.
Thank you!
Baaam this is good :D I have been waiting for this, to be honest, I had the feeling that one day you would make such a tutorial. Your content is great.
Thank you very much!
Best of the best. I'm really speechless now.
Thank you!
Thank you for explaining it so simply even a novice like me can understand it.
Thanks!
Thank you for this video. I think I understand what a transformer is now.
Thanks!
Very nice video! Thank you for uploading such helpful material :). It would be great if you made a video on vector and matrix calculus. These are important topics in NNs too :).
I'll keep that in mind.
Thanks for the great video! Also the topic proposed here would also be super interesting, so I hope you could do it someday
Absolutely fantastic explanation again
Thank you!
Thank you so much Mr StatQuest it was a big BAM! for me
Thank you!
Hi Josh... would you please make a video and explain the differences between different statistical tests like t, z, chi... I want to know the differences and when to use each.
I'll keep that in mind.
Tripple Bam for sure. Amazing explanation.
Thanks!
You will be known and remembered for the next 1000 years ..
bam!
10000000 years
Superrrb Awesome Fantastic video
Thanks 🤗
Quadruple bam! (One bam for me finally understanding)
Hooray! :)
Squatch: So it's all just matrix multiplication?
Josh: Always has been
bam! :)
Your video is just a lifesaver for me and my essay! Could you make a video on the GloVe model in NLP?
I'll keep that in mind.
'Squatch is too relatable 🙏😭💯
bam! :)
Very good video! You should remake one of the transformer videos with the matrix notation, as you did at the end of this video.
I'm working on it right now. Hopefully it will be ready soon.
@@statquest Take your time, and thank you very much; your content is so valuable!
Quite good. ❤
Thanks!
Joshua, your teaching was fantastic, but I couldn't quite grasp the concept.
What time point (minutes and seconds) was confusing?
Please make a part 2 with more details and new terms.
I'll keep that in mind.
thanks for ur effort, ur videos helped me so much, but could u plz tell us how lghm works
Do you mean Light Gradient Boost? LightGBM?
I mean LightGBM
Triple Bam🎉❤
YES! :)
Could you do a series on the "Attention Is All You Need" paper? Thank you, sir.
This video walks you through the concepts in that paper: ruclips.net/video/zxQyTK8quyY/видео.html
And this video goes through the math: ruclips.net/video/KphmOJnLAdI/видео.html
@@statquest thank you so much!!
Could you explain the math behind a basic liquid neuron and show how it differs from other neurons?
I'll keep that in mind.
Can you please create a video on multi-modal transformer architecture?
I'll keep that in mind.
I hope he does it, he's our savior when it comes to understanding complex concepts
Hi Josh, could you do a video on time series clustering and time series analysis, please?
I'll keep that in mind.
Can you please discuss stochastic gradient boosting for classification? I'm having trouble understanding it 😢
I have a whole series of videos on Gradient Boosting. You can find them here: statquest.org/video-index/
Do more videos related to GANs, etc.
I'll keep that in mind.
Hello StatQuest, what software do you use to create your videos?
(Your answer would be really useful to me.)
I give away all of my secrets in this video: ruclips.net/video/crLXJG-EAhk/видео.html
Hey Josh! I love your channel and I was thinking about buying a study guide. What is the difference between watching one of your playlists and buying a study guide? Do you cover exactly the same in both and buying the study guide is for support/like a donation or is there any difference?
They are the same. The difference is that some people like to have the study guides for offline use or adding their own notes to. In some ways, the study guides are like "cheat sheets" - everything in a video is condensed to about 3 to 5 pages.
You are awesome
Thanks!
I'm 'Squatch. 'Squatch is happy!
BAM! :)
Can we get a book on these concepts as well?
I'm writing it right now.
Should we move to Kaggle as we learn from these videos? Like solving their exercises to get a feel for real-world machine learning in Python, all the way up to the competitions they host? Or is there anything in between?
You can definitely take a stab at a Kaggle dataset and see how things go. However, I also have some videos that walk you through how to do real-world data analyses here: ruclips.net/video/GrJP9FLV3FE/видео.html and here: ruclips.net/video/8A7L0GsBiLQ/видео.html
Josh starmer is a swiftie!
Totes! :)
We want a YOLO series, mainly YOLOv8 from scratch.
I'll keep that in mind.
Please Professor, it’s an earnest request. Lots of Love from Bangladesh ❤❤
Ok, it always annoyed me that when you're doing matrix-vector (column) multiplication, they always write the matrix first, then the vector. It never occurred to me until you said so just now that the columns and rows aren't valid tensor operations if you write them the other way round... Doh! It doesn't look nice, though.
Btw, why did you use a row vector and a transposed matrix? I would always use a column vector. Column-space transforms are the default for me, and you can picture the latent space.
The only time I'd use rows is if I had a system of linear equations.
I agree that the matrix * column looks bad. And I chose to do row * matrix because that is what they used in the PyTorch documentation.
@@statquest Glad it's not just me that thinks it looks backwards! :)) But you're of course right; 2x2 * 2x1 is a valid operation, whereas 2x1 * 2x2 is, strictly speaking, undefined.
Oh, a tip you may (or may not!) find a useful teaching tool:
I always look at matrix multiplication as a series of dot product operations. Once the student understands that the dot product outputs a scalar expressing the likeness of two vectors (e.g., whether two normalised vectors point the same way), then rather than just mechanically running an algorithm, the student can see that it's plotting the vector in the new space by comparing its likeness to the space's basis vectors one axis at a time. That's why I think it's always handy to see a square matrix as a series of basis vectors.
So, if you're going from an orthonormal basis to one where, say, y is mirrored - {{1, 0}, {0, -1}} - then it's quite apparent why taking the dot product for each spatial dimension will plot the vector upside-down. You could show an image flipping to drive the point home.
I just think that's intuitive and why we're multiplying and adding across columns and rows.
At least that's how I like to see it.
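A quick NumPy sketch of the two conventions discussed in this thread, using the commenter's y-mirror matrix {{1, 0}, {0, -1}} (the vector values are made up for illustration):

```python
import numpy as np

# Transformation matrix from the comment above: mirrors the y axis.
M = np.array([[1.0, 0.0],
              [0.0, -1.0]])

x_row = np.array([[3.0, 2.0]])   # 1x2 row vector (the video's convention)
x_col = x_row.T                  # 2x1 column vector

# Row-vector convention: vector first, then the matrix...
row_result = x_row @ M           # shape (1, 2)

# ...which matches the column-vector convention with the matrix transposed.
col_result = M.T @ x_col         # shape (2, 1)

print(row_result)     # the y component comes out negated
print(col_result.T)   # same numbers, just laid out as a column
```

Either way, each output component is a dot product between the vector and one of the basis vectors stored in the matrix, which is the intuition described above.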
great
Thanks!
What about a video on MAMBA architecture ? That would be really BAAAM
I'll keep that in mind.
Thanks Josh. But naughty, naughty, the stage is not just rotating, it is flipping. Which you can also encode in matrices of course ;-)
I'm not sure I understand what you mean by flipping in addition to rotating, as stage left and stage right are maintained throughout each change.
@@statquest The drawing of the stage is asymmetrical (one edge is slightly erased). When you did the slides you flipped it instead of rotating it. As a result, Statsquatch is sometimes on one side, sometimes on the other. I know it was not on purpose 🙂 Thanks for the excellent vid as usual.
@@Nono-de3zi I'm still confused, because Statsquatch is always on stage left.
But the *stage* is flipped :-)
@@Nono-de3zi If it was flipped, then wouldn't stage left stay on top and stage right stay on the bottom?
14:05 Matrix multiplication can't be rearranged, as matrix multiplication is a sequence of calculations. Is this indicated by using × as a multiplication symbol and not •? Because in school we used • to indicate multiplication.
Ah, no, the × is not signifying order. But I would like that to be visible from the notation alone, without the helpful explanation.
I wonder why matrices are turned sideways like that. It would feel easier for me to multiply rows with rows.
This is explained, although I'm guessing not to your satisfaction, at 10:58. It has to do with the ability to combine transformations. For more details, see: math.stackexchange.com/questions/271927/why-historically-do-we-multiply-matrices-as-we-do
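A small NumPy sketch of the "combining transformations" point: matrix multiplication is associative, so a chain of transformations can be collapsed into one matrix, but it is not commutative, so the order can't be swapped. The rotation and stretch matrices here are just illustrative examples:

```python
import numpy as np

# Two transformations: a 90-degree rotation and a horizontal stretch.
rotate  = np.array([[0.0, -1.0],
                    [1.0,  0.0]])
stretch = np.array([[2.0, 0.0],
                    [0.0, 1.0]])

v = np.array([1.0, 1.0])

# Associativity: rotating then stretching, one step at a time...
step_by_step = stretch @ (rotate @ v)

# ...gives the same answer as pre-combining the two matrices into one.
combined = stretch @ rotate
assert np.allclose(combined @ v, step_by_step)

# Non-commutativity: swapping the order gives a different transformation.
print(stretch @ rotate)
print(rotate @ stretch)
```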
After 10000000 years, scientists found the fossil record of StatQuest; then he said "BAM!"
Ha! You made me laugh.
Great video, but I don't quite understand 25:25...
It just means that PyTorch stores the weights differently than we used in the earlier examples and in order to get the same math, we have to transpose the PyTorch weight matrix.
BAM!
:)
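A minimal PyTorch sketch of the reply above: `nn.Linear` stores its weights as an (out_features, in_features) matrix, so reproducing the layer's math by hand requires transposing that matrix (the layer sizes and input values here are arbitrary):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
linear = nn.Linear(in_features=2, out_features=2)

# An arbitrary input, shaped as a (1, 2) row vector.
x = torch.tensor([[1.0, 2.0]])

# linear.weight has shape (out_features, in_features), so to match the
# row-vector math from the video we transpose it before multiplying.
by_hand = x @ linear.weight.T + linear.bias

# The hand-computed result matches the layer's own output.
assert torch.allclose(linear(x), by_hand)
```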
Woaw
:)
❤
+1.6?! Is that correct?
What time point in the video, minutes and seconds, are you asking about?
@@statquest 21:34
@@mahdi.ahmadi.2 The video is correct. 1.6 is the bias value that we add to -1.0. This gives us a sum of 0.6, as seen at 21:45
@@statquest Actually, my main question is where the Bias numbers +1.6 and 0.7 come from?
@@mahdi.ahmadi.2 The weights and biases for all neural networks are obtained with backpropagation. For details on how that works, see: ruclips.net/video/IN2XmBhILt4/видео.html
This is the easiest chapter for me.😆😆😆
bam! :)
Hi, I am trying to start a YouTube channel to make tutorial videos about data-science-related topics. I want to make videos about things that are less popular but still important, since I found that it can be quite difficult to start off with these things because most of the information is in difficult-to-comprehend papers. My starting point will be social network analysis and natural language processing, as that is my main interest and expertise. However, I am interested in finding more topics, so I am starting by doing research on different channels that make tutorials for data science, AI, machine learning, statistics, natural language processing, graph theory, or network analysis.
So for anybody in the comments who reads this message: could you help me out by replying with any YouTube creators that do something related to these topics, or any other digital platform like Brilliant? If you know a topic that is similar to the ones I mentioned, that would also be a great thing to share. Or if you know of better places to share this message. Or any other helpful tips.
Thanks, everybody, for the help. If this message is regarded as spam, please say so and I will remove it.
The topics again:
-data science
-AI
-machine learning
-statistics
-natural language processing
-graph theory
-network analysis
Now I know a website where I can learn machine learning by myself.😁😁😁
:)
zhina!
?
97 videos finished … small bam 🥲
Wow!