Matrix Multiplication - Topic 19 of Machine Learning Foundations
- Published: 8 Feb 2025
- In this video from my Machine Learning Foundations series, I’ll demonstrate matrix multiplication - the single most important and widely-used mathematical operation in machine learning. To ensure you get a solid grip on the principles of this key skill, we’ll use color diagrams, calculations by hand, interactive code demos, and an applied learning example.
There are eight subjects covered comprehensively in the ML Foundations series and this video is from the first subject, "Intro to Linear Algebra". More detail about the series and all of the associated open-source code is available at github.com/jonkrohn/ML-foundations
The next video in the series is: • Symmetric and Identity...
The playlist for the entire series is here: • Linear Algebra for Mac...
This course is a distillation of my decade-long experience working as a machine learning and deep learning scientist, including lecturing at New York University and Columbia University, and offering my deep learning curriculum at the New York City Data Science Academy. Information about my other courses and content is at jonkrohn.com
Dr. Jon Krohn is Chief Data Scientist at untapt, and the #1 Bestselling author of Deep Learning Illustrated, an interactive introduction to artificial neural networks. To keep up with the latest from Jon, sign up for his newsletter at jonkrohn.com, follow him on Twitter @JonKrohnLearns, and on LinkedIn at linkedin.com/in/jonkrohn
This video series/playlist is exactly what we spent hours looking for to learn data science/machine learning from scratch. As a total beginner, THIS IS PURE GOLD. Masterclasses with crystal-clear explanations from a highly educated and experienced professional. Terms are explained in a super clear and interactive way, knowing this is something for true beginners who have no previous knowledge. I love that we get introduced to each topic/term right away.
Jon, I mean this: I hope you realise that your content saves people's lives. How? You're literally saving us hours and hours, which keep accumulating, of searching for high-quality free educational content. Now we can spend this irreplaceable time on something else. I immediately liked and commented as soon as each video started to do my part in feeding the algorithm. This should be the top result for every beginner ML search on YouTube. I could keep showing you my appreciation, but I have to keep learning. Thanks for every second spent planning, recording, and editing. I wish you the best: professionally, personally, mentally, and spiritually (not in that specific order).
This video has cleared most of my doubts about linear regression.
The video is really intuitive.
Thank you so much. You are awesome ❤
Lots more videos on different implementations of linear regression coming later in the series - it's a valuable technique!
Thanks Jon for making these videos. It's been really fun to refresh my math with you. I'm excited to keep going and learn more about machine learning. Thank you so much for the work you've done to contribute to the learning of others.
Hello, champ 👋👋👋
Thank you, bro, you have shown where we use matrix multiplication. I'm satisfied and want to learn more in the field of AI and ML.
New subscriber, Jon. Love the explanation and the video, thank you for putting this out here. Hope you get more subscribers!
Came here from Coursera to resolve my doubts, and now I think I'll complete the rest of my machine learning course with your videos. Thank you!
Shouldn't it be the following (at 6:05)?
A = np.array([[3,4],[5,6],[7,8]])
b = np.array([[1],[2]]) # instead of: np.array([1,2])
np.dot(A, b)
This is strange; I never understood this in Python. Which version is the correct one?
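For anyone puzzling over this: as far as I can tell, both versions are valid NumPy, and which one is "correct" depends on whether you want the result as a 1-D vector or a 2-D column matrix. A small sketch, using the arrays from the comment above:

```python
import numpy as np

A = np.array([[3, 4], [5, 6], [7, 8]])   # matrix, shape (3, 2)

b_1d = np.array([1, 2])                  # 1-D vector, shape (2,)
b_2d = np.array([[1], [2]])              # 2-D column matrix, shape (2, 1)

# With a 1-D b, np.dot treats it as a vector and returns a 1-D result
print(np.dot(A, b_1d))    # [11 17 23], shape (3,)

# With a 2-D column b, the result keeps 2-D shape (3, 1)
print(np.dot(A, b_2d))    # [[11] [17] [23]], shape (3, 1)
```

The entries are the same either way; only the shape of the output differs, so neither version is wrong.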
Very good series!!
Thank you, Gustavo :)
Learning so much from you!
I'm so glad!
I really liked this video, thanks!!
You're welcome :)
The year, I guess, was 1988, when I was in year 12, and at that time I enjoyed doing these matrix multiplications with paper and pen. Little did I know that 34 years later I would be learning the same stuff from Jon and implementing it using NumPy, PyTorch, and TensorFlow.
I am pleased I have covered half of the videos. I am going to stop for today. Again, I have the same question: was it not possible for TensorFlow to overload the methods for matrix-vector and matrix-matrix multiplication?
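For comparison (an aside, not from the thread): NumPy's `matmul` is overloaded in exactly this way. When the second argument is 1-D it is treated as a vector, and when it is 2-D an ordinary matrix product is performed. A minimal sketch, with made-up numbers:

```python
import numpy as np

A = np.array([[3, 4], [5, 6], [7, 8]])   # matrix, shape (3, 2)
v = np.array([1, 2])                     # 1-D vector, shape (2,)
B = np.array([[1, 9], [2, 0]])           # matrix, shape (2, 2)

# Matrix-vector: the 1-D argument is treated as a column vector,
# and the result comes back as a 1-D array of shape (3,)
print(np.matmul(A, v))

# Matrix-matrix: ordinary matrix product, result shape (3, 2)
print(np.matmul(A, B))
```

The `@` operator calls the same machinery, so `A @ v` and `A @ B` behave identically to the `np.matmul` calls above.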
Thank you, sir.
The part on house prices you were showing at the end, was that forward propagation? Wow, I've never seen it presented like that.
Yes! "Forward propagation" or the "forward pass" is simply feeding our input (typically denoted as x) into our model to produce our output (typically denoted as y). In a deep neural network, this forward pass can involve thousands of nested functions. These ideas are fleshed out in complete detail in Segment 3 (Autodiff) of Subject 3 (Calculus), coming up later in the series.
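To make the idea concrete, here is a minimal sketch of a forward pass, with entirely hypothetical weights and features (these numbers are illustrative, not the ones from the video):

```python
import numpy as np

# Hypothetical single-layer model predicting a house price:
#   y = W x + b
# where x holds input features and W holds learned weights.
x = np.array([3.0, 120.0])    # features: e.g. number of rooms, square meters (made up)
W = np.array([[10.0, 1.5]])   # weight matrix, shape (1, 2) (made up)
b = np.array([5.0])           # bias term (made up)

# The forward pass is just a matrix-vector product plus a bias
y = W @ x + b
print(y)    # predicted price in arbitrary units: [215.]
```

In a deep network the same pattern repeats layer after layer, each layer feeding its output forward as the next layer's input.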
You are awesome 😀 Thanks
Ha! You're welcome :)
So the dot method or matmul is built in here and automatically gives the dot product (matrix multiplication), with no algorithm for us to write 🤔
Yep! Piece of cake 🍰
But still, a good idea to understand the underlying math :)
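In that spirit, here is a rough sketch of what the library is doing under the hood: each output entry is the dot product of a row of the first matrix with a column of the second. (A naive pure-Python version for illustration only; the real implementations are heavily optimized.)

```python
import numpy as np

def manual_matmul(A, B):
    """Naive matrix multiplication: entry (i, j) of the result is the
    dot product of row i of A with column j of B."""
    n_rows, n_cols, inner = len(A), len(B[0]), len(B)
    return [[sum(A[i][p] * B[p][j] for p in range(inner))
             for j in range(n_cols)]
            for i in range(n_rows)]

A = [[3, 4], [5, 6], [7, 8]]
B = [[1, 9], [2, 0]]

print(manual_matmul(A, B))        # [[11, 27], [17, 45], [23, 63]]
print(np.matmul(A, B).tolist())   # same result via NumPy's built-in
```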
@JonKrohnLearns What's the reason for using PyTorch and TensorFlow for the operations up until now, when NumPy can also perform them?
Thanks a lot!
Most welcome!
This is an outstanding course, sir. I love your method of teaching. May I have your email or WhatsApp?