When I search how to do machine learning from scratch:
The videos: So you first do import tensor flow
Me: closes video
Me finds you tutorial series: I like this one
This is my main problem with virtually every ML tutorial on RUclips that is not a basic introduction. They don’t explain how it works, they just tell you to import a library.
@@farenhite4329 yep exactly
@@farenhite4329 It's because they themselves don't understand how it works
yes.. so true. Nowadays you will find tons of tutorials about DL and ML, and all of them focus on importing and applying different frameworks and libraries rather than building proper deep-level intuition.
Please don't abandon this series! RUclips is begging for a tutorial this clear and concise about neural networks! Thank you!
Wouldn't think of it!
@@sentdex pls pls pls finish it
@@sentdex you will make great contributions to society if you continue this series
@@sentdex are you planning to add new episodes? 🙏
@@sentdex you not only thought of it, you made it :'(
The animations are so helpful!
Glad you like em!
@@sentdex I was thinking the same thing, also they look pretty cool! What tool are you using to do them? I have a friend who is a physics teacher and might be interested in that :)
(edit) Maaaan every time I ask a question i have to remove it because you answer it later in the vid or in the next one, awesome ^_^
@@Kawabolole he is using manim by 3b1b
This comment is so not helpful
@@sentdex How do you do this?
i knew u would upload in 7 days.
I kinda counted days left like a child.
We tried so hard to upload faster. Still trying xD
@@sentdex i think it's good so we can feel the value
i second this. my only complaint is that these videos are coming out too slowly. pls help us!
@@sentdex 4 days break will be good for a video.
@@adjbutler I like the post rate right now. Really lets you mull over the new information, which will be especially helpful when we get to the more complicated parts of nn's. Especially considering Harrison is writing his book alongside making this, it's a great amount of content and builds anticipation for each release (something that lacks when I go back through his old playlists and binge them all at once) :D
The moment you mentioned the "shape" problem you became the MVP of youtube machine learning.
You clearly remember what it was like to be a first time learner and it shows in your communication style, well done.
I, along with every other viewer following along with this series, want to thank you for making this series. Such a painless way to learn an intricate and exciting topic. The visuals are a great bonus.
My pleasure!
Never been so excited after seeing a RUclips notification! P3 nnfs, BOOM!
Been there ! Done that ;)
Sentdex drops a new vid. “Wife grab the kids I have a work emergency”
Dude this is legit one of the most helpful and intuitive coding tutorials I've ever seen! Some tutorials are really hard to watch, but yours is very comprehensive. Thanks for that, I appreciate it.
Man, the animations are in my opinion fundamental for the full understanding of the content. Huge thanks to Daniel who's done them.
This is possibly one of the few times I'm glad I took further maths in sixth form, because without going to uni I have covered and understood matrices and vectors in 3D
Watching this tutorial at the same time as I go through your pytorch tutorial. My head blows up of all the new things
Hah, good luck sir :D
Never been so excited for a video to come out on RUclips.
what you're making here will be the definitive ML from scratch guide in a few years, calling it now
21:00 - 22:00 the best minute I've watched on youtube in a long time
I am taking a machine learning class and my instructor didn't explain it the way you do and this course is what i want.
Thank you for strengthening my weakness, i'll buy your E-book next week!!
How is anyone down voting these? These are fantastic and animations are so incredibly helpful.
As someone with a non-engineering background, I can't describe how enlightened I am by your videos. Thank you, and please keep doing what you are doing!
you dont need an engineering background for this lol
Man I just want to watch all the videos since i have so much free time during quarantine. Might have to go to the book
Wish we could make these videos faster, doing our best :D... but yes, book should keep you busy for a while!
I'm not normally a fan of animations, but yours are clean and not too colorful, it makes it very helpful to digest the concepts that you're explaining.
this has been really helpful so far but i really need to wait until the series is done because i forget everything in-between episodes
You deserve a medal for the best complex-topic-synthesizer. You don't even have to be a high school graduate to grasp the content in these initial 3 videos so far. Kudos bro
Almost a 1000 views in 30 minutes, shows how much we love this🔥❤️
Have been visiting your channel every day since Part 2, was not disappointed today haha
Thanks sentdex!
Glad you enjoy it!
This might just confuse some people, but it helped me: The bias is essentially just another weight, for an imaginary input whose value is always 1.0
well in my mind 1.0 represents something in its full form, where any 0.x number would be something partial. so 1.0 is like saying it's a full neuron. but i have no idea what i'm talking about haha
@@MegaGutemusik The input is always 1 for the bias because 1 is the neutral element of multiplication. This means that you can put the bias into the weights array (usually at index 0) and therefore learn the bias alongside the weights.
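A minimal NumPy sketch of the trick described above (the numeric values here are made up for illustration): folding the bias into the weight vector, with a constant input of 1.0 in front, gives the same output as adding the bias separately.

```python
import numpy as np

# Illustrative single-neuron values.
inputs = np.array([1.0, 2.0, 3.0, 2.5])
weights = np.array([0.2, 0.8, -0.5, 1.0])
bias = 2.0

# Standard form: weights . inputs + bias
out_standard = np.dot(weights, inputs) + bias

# Fold the bias into the weights by prepending a constant input of 1.0
inputs_aug = np.concatenate(([1.0], inputs))
weights_aug = np.concatenate(([bias], weights))
out_folded = np.dot(weights_aug, inputs_aug)

print(out_standard, out_folded)  # both print 4.8
```

Because the two forms are identical, an optimizer that learns the augmented weight vector learns the bias for free.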
The animation at the end is most helpful
Those animations are freaking awesome and truly helping me understand what's happening in the code.
I think it's worth mentioning how the calculus behind the dot product works. If you have 2 matrices (in the shapes n*p and p*m), then the resulting matrix will be of shape n*m after the dot product (see how p = p), and the number of "columns" of the first matrix has to be equal to the number of "rows" of the second matrix.
Exactly what I was going to say
except that's not really calc lmfaoo but elementary linear algebra
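A quick NumPy sketch of the shape rule from this thread, using arbitrary example shapes: (n, p) dot (p, m) gives (n, m), and mismatched inner dimensions raise an error.

```python
import numpy as np

a = np.random.rand(2, 3)   # n=2, p=3
b = np.random.rand(3, 4)   # p=3, m=4

c = np.dot(a, b)           # inner dims match: (2, 3) . (3, 4)
print(c.shape)             # (2, 4)

# Swapping the operands breaks the rule: (3, 4) . (2, 3), and 4 != 2.
try:
    np.dot(b, a)
except ValueError as e:
    print("shape mismatch:", e)
```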
Never have I wanted the next episode more in a series. Thank you for these videos.
these animations are awesome ive been thinking that the whole time.
Thank you
@ they are very helpful to visualize the concept. Keep it up. Kudos to both of you.
@ what kind of software do you use for animation?
the way you break things down is super useful. i can't retain info if i have too many questions about it; my brain just locks up, so all the high-level explanation videos of neural networks just got me excited but didn't teach me at all. you're clear enough you could just call this series
The Understanding the "Understanding Neural Networks" Videos Series!
I wish game of thrones seasons 7 and especially 8 were as good as this tutorial
Incredibly clear, amazing teacher, when you can simplify to that level means you have true mastery of your material, thank you!
*The answer is of course to use loops*
Laughs in functional programming
Finally somebody has started explaining how NNs work so anybody can understand and start building their own AI. Thank you very much! You are a good teacher!
This is purely my understanding of weights and biases....
For Example,
y = x1 * w1 + x2 * w2 + b
x1, x2 => Inputs
w1, w2 => Weights
b => Biases
w1 => Denotes the contribution of x1 to the output
w2 => Denotes the contribution of x2 to the output
b => Acts as an offset...
This is just a linear equation;
when activation functions are applied => non-linearity is introduced.
Why do we apply activation functions?
Not all data can be explained by a linear equation, so we apply activation functions to introduce non-linearity.
Dumb question: what does non-linearity mean?
@@taran7954 something that can't be fit by a straight line
@@taran7954 non (not) linear (line) means it's not a line
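A tiny Python sketch of the comment above, using made-up numbers and ReLU as an example activation: the weighted sum is just a linear equation, and the activation function is what introduces the non-linearity.

```python
x1, x2 = 1.0, -2.0   # inputs
w1, w2 = 0.5, 0.25   # weights
b = -0.2             # bias

# Just a linear equation: y = x1*w1 + x2*w2 + b
y_linear = x1 * w1 + x2 * w2 + b

# ReLU activation: clamp negatives to zero, which a line cannot do
y_relu = max(0.0, y_linear)

print(y_linear, y_relu)  # -0.2 becomes 0.0 after ReLU
```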
So far this is the best video series about neural networks. The great thing about these videos is that each time you do the same task, but with different, more advanced code. For me it's the best way of teaching, and I really enjoy watching your videos. Thanks!
Anyone who stumbled upon these lectures in 2023?
Entered the black hole of trying to do this in a programming language other than python.
I'm in 2024 actually
Just found it myself
2024 here!
2024
I do not know why anyone would dislike this very nice video.
The animation is very nice and makes the explanation clearer.
Thank you so much
I'm a bit confused -- here, weights is a 3x4 matrix, and inputs is a 1x4 matrix. Strictly speaking, wouldn't the dot product only work if inputs is an n-by-m matrix (4x1 in this case), where m is the number of samples, as opposed to what's shown here? Looks like NumPy is smart enough to perform the dot product with a rank-1 vector even when the shapes don't seem to match.
Your confusion starts @ shape. First, the input shape is not 1x4, it's of shape (4,). Also, it's not a matrix. I think you might want to watch that shape section again.
Also, you can always confirm shapes in numpy. You might want to tinker about until you feel solid at knowing something's shape.
For example:
>>> import numpy as np
>>> x = np.array([1,2,3,4])
>>> x
array([1, 2, 3, 4])
>>> x.shape
(4,)
>>> y = np.array([[1,2,3,4],[5,6,7,8]])
>>> y.shape
(2, 4)
>>>
@@sentdex Your explanation makes perfect sense!! The example really helps clarify. Thanks!
I have the Same Question
Weights is (3, 4) and inputs is (4,), so the product becomes (3,). Hope it helps!
Note that in NumPy a 1-D array is neither a row nor a column vector: its shape is just (4,), not (1, 4) or (4, 1), and np.dot lines it up with whichever axis matches, so (3, 4) dot (4,) works directly.
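A short NumPy check of the shapes discussed in this thread (the numeric values are just illustrative): a (3, 4) weights matrix dotted with a (4,) input array yields a (3,) output, one value per neuron.

```python
import numpy as np

# 3 neurons, 4 inputs each.
weights = np.array([[0.2, 0.8, -0.5, 1.0],
                    [0.5, -0.91, 0.26, -0.5],
                    [-0.26, -0.27, 0.17, 0.87]])   # shape (3, 4)
inputs = np.array([1.0, 2.0, 3.0, 2.5])            # shape (4,), a 1-D array

output = np.dot(weights, inputs)  # (3, 4) . (4,) -> (3,)
print(output.shape)               # (3,)
```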
I seriously cannot thank you enough. I'm a Java programmer, but you have made this so simple to understand that I'm able to implement it without getting lost. Hats off to you good sir... thank you for these videos!! You are truly doing a great public service!
I would very much love the physical book, sadly my money situation is pretty much nothing at the moment, so I can't buy it...but even if I finish this series before buying the book, I still plan to purchase it at some point.
Hope your situation improves!
@@sentdex please share the link to the book
This is the way to teach. I'm sure this series will take my scattered knowledge about neural networks and make it into an actually useful skill. I'm so thankful that this is being uploaded on youtube for free so that broke people like me can learn too. The first money I earn from a job is going to you my man. Great work.
I hope this series can do just that for you!
@@sentdex One more question: I have mostly been coding in C/C++ up until now, but seeing how easy and expressive Python is, it's time to learn it. I see so many tutorials on your channel; is there any playlist you recommend that will help me brush up on the basics and follow along with this series?
@@tanmaygoel9435 pythonprogramming.net/python-fundamental-tutorials/ - here, basics and intermediate fundamentals, whatever you need :)
Will you implement some sort of autograd later on in the series? Loving the videos btw
I haven't had this much joy in a while, learning something with this much CLARITY on RUclips! Thank you so much for all the effort you guys have put into this, it's awesome!
video_value = True
while video_value:
    print("Finally! I bought the book also!")
The best neural network guides ever! Thank you! I always look at some videos and people are just so ignorant about explaining some 'basic' (as they think) stuff that may seem obvious to them... but if someone is starting from scratch it is super helpful and saves TONS of time that would otherwise have to be spent researching all of it on the side. Again thanks, you're awesome
I've never heard that a "list of lists" is a "lol" before, lol!
This series has been amazing. Also, being a highly visual learner ... the animations really take things to the next level for me! Thank you both!
The moment I have been waiting for: "Watching sentdex's latest video" .
Super excited about these videos. You're excellent at explaining things, and I'm happy to preorder the book to support you creating these tutorials! Keep at it, we're all learning leaps and bounds because of you.
Thank you for the support!
smh, there is no spot for assembly in the github
Sounds like you need to make the assembly version! I'll wait for your PR :)
@@sentdex Lol, I would if I knew enough assembly. I'll stick to contributing in Kotlin for now. P.S.: Jeez you're fast at merging pull requests, I assumed it would take like a day because you would be busy.
And just like that someone made a PR for assembly
I want to get started with neural networks in general, and this series is really helpful in explaining them; the other tutorials are much harder. You explain it very well, making sure we (the viewers) understand the basics before getting to the actual making of the real network.
Glad you feel that way so far! Hope that continues!
to whoever wrote it in assembly on github: why, why you do dis to yourself?
Xdxdxd ......
Oh my god, that's a clean and simple explanation. Even a 10-year-old kid could understand it if the tutorial is from sentdex. Eagerly waiting for the next videos
you are called "semtex" for me and i refuse to properly read your name ever again
You are helping a ton of people (like me) who can write working code but don't quite understand the granular workings. I will gladly buy your book to glean some more details. Thank you for sharing this amigo.🙂
These videos are incredible. Working through Andrew Ng's older intro to ML Course, but in Python not Octave. Not much background in linear algebra, but stronger in Python. Building from the ground up -- learning math by coding -- this is the best way to learn.
You and Daniel are the best, you are just putting the learning data inside my head omg
I LOVE this series, This helps me hate college maths a little less
The way you highlighted the bias, weight and activation function through animation is just extraordinary, kind of enlightenment. Thank you so much. It was deeply helpful
The best videos on neural networks on youtube. Simple explanations and super easy to grasp. 'Enlightenment' is the word after watching this. Thank you so very much :)
It took me 10 minutes of googling to figure out how to deal with pip, but I got it in the end. Once again these videos are incredibly inspiring.
Thanks for the new upload!
I didn't realise how flexible the dot product function can be - in class we only used it for vector values, and didn't even touch upon the idea that you can use matrices with multiple dimensions as inputs. This is already super useful information. Again, thanks! :)
You are welcome!
By far one of the best videos that I have seen in ML.
THANK YOU VERY MUCH.
Thanks so much for these series! I had so much trouble jumping into neural networks without understanding everything happening "under the hood" so to speak. I just always felt like I was just assembling one of those pre-designed lego sets without understanding the thought behind it.
COMING FROM AN ENGINEERING BACKGROUND MYSELF, I found it very impressive how you use the straight-line equation and visualisation to explain the meaning of BIAS AND WEIGHT. THANK YOU LEGEND SENTDEX
One of the most direct explanations I've ever seen... before this, the whole time I was thinking this may be too esoteric of a predictive tool for me to learn well, wow
I have been waiting for this for so long.. it felt like years since the last video came...
We're trying to put them out even faster, and can't seem to do it lol. Will keep trying though!
My son is studying AI & computing at University. Your videos give us an opportunity to get together and discuss the topics. Thanks.
Cool to hear!
This is great material. I don't know how is this free on RUclips. Keep up the great work.
I really appreciate you explaining the math behind this. I like to deeply understand what I do and why it works. Thank you for the video :D
This series is lovely!! It feels so good to actually understand the basic mathematics along with some practical programming. There are so many other resources that either focus completely on the mathematics part (which after a certain point starts to feel like jargon) or just focus on using libraries like pytorch (which begins to feel like copying and pasting after a certain point). Thanks for doing this dude!
Glad to hear you like the style!
Why is this series so much better than every other one? And why is he the only one that explains the math?
OH MY GOD I LOVE YOU! The explanation you gave for those 3D arrays was the AHAAAA!! moment for me.
These tutorials actually help me understand the need for math more, because I hated it in high school. But now that I can see how it is applied, I understand how and why to use it, which makes it so much more interesting to learn. Thanks a lot for these videos!
These videos are so clear and easy to understand it's crazy. My CS professors should aspire to be this good at explaining things.
I am so hyped for the series. Indeed the animations are very helpful for understanding the concepts. Looking forward to watching the sexy part!
I like the comparison between the straight-line equation (y = mx + b) and the "neuron formula": inputs*weights + bias
Great job so far keep it up
He is better than my math teacher ever was.. Understood NN and even math at one go..
I really didn’t like this guy before, but I have to say he was awesome in this series. Congratulations man.
Was eagerly waiting for this video.
I checked my RUclips so many times over the past 2 days.
That feeling when you actually understand this. Thanks for breaking this down so well.
I've been waiting for this video to drop for the past couple days lol. Let's finish this!!!!!
Enjoy
LOVE THIS, it really taught me about Python and ML. I didn't know a lot about inputs and outputs until I watched these videos, thx bro
You explain in such a nice and easily understandable way, and those animations are very helpful.
The content is so helpful, and because of you I'll pass my college exams.
Thanks
i have taken a linear algebra course at uni; i wish these animations had existed back then. I felt very happy when I linked the concept of np.dot() with matrix multiplication, then I figured out how to get the output of np.dot(multi_dim1, multi_dim2) by
using the transpose. thank you very much.. you are developing my understanding
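A small NumPy sketch of the transpose trick mentioned above, with made-up matrices: when the inner dimensions don't line up, transposing the second matrix makes np.dot work.

```python
import numpy as np

a = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # shape (2, 3)
b = np.array([[1.0, 1.0, 1.0],
              [2.0, 2.0, 2.0]])   # shape (2, 3)

# np.dot(a, b) would fail: (2, 3) . (2, 3) has mismatched inner dims (3 != 2).
c = np.dot(a, b.T)                # (2, 3) . (3, 2) -> (2, 2)
print(c)                          # [[ 6. 12.] [15. 30.]]
```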
Awesome work on the animations Daniel. Harrison, you got heart man. The way you explain, taking so much time to ensure that every little thing is conveyed, is simply amazing. Respect and a ton of thanks with all my heart.
Sentdex, you are awesome. I love these tutorials and I'm saving money to buy your book. I never thought that I could learn AI by myself, but now I'm in love with this part of the programming world
Recently subscribed to 3b1b and you. Glad to see you appreciating each other's work, and I'm proud to learn from amazing teachers like you.
Someone has to give you a Nobel Prize for the best explanation on earth
I already know NNs as I work with them every day, but it's awesome to see different approaches. Loving the animations, loving the series. Keep it coming man👌
Glad you like them!
Great work man! Seriously, I became a huge fan of your work. Your way of making things understandable is the most admirable... everything just gets clearer if one has an understanding of basic mathematics... and if not, that's what you're there for... your videos really are a treat👏.
Hats off to you pal. Great content helping millions of people (10000000000000000000000000000/10). Some real respect to you.
Top notch on the animations. I'm still here. The over/under for when I'm completely lost is Lecture 6.
Thank you 3Blue1Brown and Daniel, and especially you sentdex. Really appreciate the work you are putting in here!!
Many thanks for the videos. You are the best tutor that I have come across for deep learning. The animations help us understand the concept even better.
Looking forward to your upcoming videos.
Wow! Nice video, very easy to understand for beginners like me. Shout out to the animator, it really is very helpful, especially the "Shape" part. Thanks!
Amazing brother, I can't believe I can actually follow this course, easily too! Sorry I can't afford the book to support you more, but you get all the likes and subbed. Btw zeroeth is my new favorite word.
Brilliant videos! The level of instruction here is fabulous. I was going to compliment the animations but you already covered it at 23:58. Thanks for posting!!
I complained at the slow pace of the previous episode. It's getting much better to my taste, I even had to pause a few times to check the code. Cool. And indeed, congrats to Daniel, those animations are excellent!
Thank you for sharing this updated view :]