Hey everyone! Hope you enjoyed the video and it's helpful for you! Make sure to SMASH that like button and Subscribe to help the channel grow. The next video will be sickkkkk! See you then.
My left ear really loved this video
fr
My left ear loved this video.
thanks bro, I thought my headphones were the problem
if you can't explain it to a 6 year old, you don't understand it yourself. brilliant video
I have read a lot of articles and watched a few tutorials. But THIS is the perfect explanation for beginners in the ML field. Thank you very much.
What an amazing explanation ! It shows how well you yourself understood it - so glad I found you on Twitter !!
Thanks! Really appreciate it!
Straight to the point. I honestly like how you talk more about theory and analysis rather than code.
Thanks!
That was easily the best explanation of learning curves. I have seen each of those, except the perfect curve, but I will keep trying!
I believe you have taught so many artificial brains that you know how to get information into even the slowest human brains out there. You repeat the fundamentals with a different tone, put the words on screen, and give it time to absorb, ensuring there is no overfit or underfit in my learning today.
Great job and you got a new subscriber.
Best explanation ever on RUclips! Keep it up man!
Excellent analogy 👏 Thank you very much Santiago!! Your videos are so cool and to the point !!
Glad you like them, Javier! Really appreciate your comment.
Thanks for this video. I would say validation set, cross-validation sets, or resamples instead of testing, but the main ideas are the same. I only use the holdout set once for the final fit, and some people could misinterpret some of the concepts here.
Absolutely! This is exactly what a validation set looks like. I wanted to keep it simple and talk about only 2 sets for the sake of the video, but you are correct.
Best explanation on the entire youtube
Superb explanation of what the problem is and how to approach solving it.
Glad it was helpful, Paul! Really appreciate your comment.
Great quality of information and really precise. So helpful for a beginner like me
Thanks, Victor! I'm glad this is helpful!
Teaching style is too unique and too good 👍
Great explanation with lots of illustrations, simply a very good job, keep going.
This was very helpful but how do we define 'high' and 'low' loss? It's relative I assume but is there some rule of thumb?
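There's no fixed number, but one common sanity check (just a sketch, not from the video) is to compare your loss against a naive baseline, such as always predicting the mean of the training targets. Everything below (scikit-learn's DummyRegressor, the synthetic data) is only one way to set that up:

```python
# Sketch: judge "high" vs "low" loss relative to a naive baseline.
# Assumes a regression problem with scikit-learn; adapt the metric to your task.
from sklearn.datasets import make_regression
from sklearn.dummy import DummyRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline: always predict the mean of the training targets.
baseline = DummyRegressor(strategy="mean").fit(X_train, y_train)
baseline_mse = mean_squared_error(y_test, baseline.predict(X_test))

# The actual model.
model = LinearRegression().fit(X_train, y_train)
model_mse = mean_squared_error(y_test, model.predict(X_test))

# A loss is only "low" relative to something; the baseline gives you that anchor.
print(f"baseline MSE: {baseline_mse:.2f}, model MSE: {model_mse:.2f}")
```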
Can't describe how helpful and beautiful this video is, simply Amazing.
this is so good. so much important information in short time. big thanks. :)
YES!!!! You just solved a problem I ran into years ago!
Thank you! That helped a lot!!
Why does my loss curve start very low in the early epochs (at epoch 02: loss 0.007), but when I look at the evaluation metrics (MAE, RMSE) they are very bad (0.94; 1.12)? I'm working on a regression problem, please help me!
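Not a definitive diagnosis, but one common cause is that the loss and the metrics are computed on differently scaled targets (for example, the loss on normalized values and MAE/RMSE on the original scale). A quick sketch of the relationship, assuming the training loss is MSE; the numbers below are taken from the comment or made up for illustration:

```python
# Sketch: if the training loss is MSE, it should be consistent with RMSE (RMSE = sqrt(MSE)).
# A big mismatch often means the loss and the metrics use differently scaled targets.
import numpy as np

reported_loss = 0.007                      # MSE-style training loss from the comment (assumed)
implied_rmse = np.sqrt(reported_loss)
print(f"RMSE implied by the loss: {implied_rmse:.3f}")   # ~0.084, far from the reported 1.12

# How scaling creates the gap: same predictions, different target scale.
y_true = np.array([10.0, 12.0, 9.0, 11.0])
y_pred = np.array([10.1, 11.9, 9.2, 10.8])

scale = y_true.std()
mse_scaled = np.mean(((y_true - y_pred) / scale) ** 2)    # what a model trained on scaled targets sees
rmse_original = np.sqrt(np.mean((y_true - y_pred) ** 2))  # what you compute at evaluation time
print(f"MSE on scaled targets: {mse_scaled:.4f}, RMSE on original scale: {rmse_original:.4f}")
```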
Understood every bit of it, well done brother ❤❤
This is one of the best Machine Learning channels I've seen. Thanks Santiago 🙏🏻 You have a new subscriber. I came here from Twitter, your content there is super good❤️🙏🏻
Please keep making explanatory videos with simple language, so anyone can understand.
Thank you again🙏🏻
Thanks
This was so helpful! Thank you
Best Explanation Ever!!!!!!!!!!!!!
Qn. Can we use the concept of overfitting to understand how good our training data is? E.g., what if we cannot overfit our training data? Can we say our data is not good enough to train the model?
If you can't overfit your model, you might have a problem with the data, yes (it could also be a problem with the model itself, of course.)
@@underfitted AFAIK, if a model is too simple, then we cannot overfit it on the dataset. Right?
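A sanity check along these lines (just a sketch, not from the video): try to overfit a tiny subset on purpose. If a reasonably flexible model can't memorize a handful of samples, the problem is usually the model or the pipeline rather than the amount of data. The model and dataset below are assumptions for illustration:

```python
# Sketch: deliberately try to overfit a tiny subset as a sanity check.
# If training accuracy on a handful of samples isn't ~100%, suspect the model or the pipeline.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Take a tiny slice of the training data on purpose.
X_tiny, y_tiny = X[:32], y[:32]

model = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X_tiny, y_tiny)

train_acc = model.score(X_tiny, y_tiny)
print(f"training accuracy on the tiny subset: {train_acc:.2f}")
# Close to 1.0: the model has enough capacity to memorize, so the issue is likely elsewhere.
# Well below 1.0: the model is probably too simple (or something in the pipeline is off).
```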
Great explanations, especially useful when you work on your own dataset rather than the Kaggle ones.
One of the things that Kaggle taught me is how to avoid over-fitting.
Q. If my problem is a binary classification problem and I'm using a built-in model like a decision tree or random forest, how do I get the loss function? And if that is not possible or not logical, can drawing a learning curve with accuracy on the y-axis and the number of training samples on the x-axis still identify overfitting and underfitting?
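A rough sketch of how such a curve could be drawn with scikit-learn, under assumed synthetic data and a random forest. Tree models don't expose a training loss the way neural networks do, but you can still score with log loss through predict_proba (scoring="neg_log_loss"), and accuracy against the number of training samples shows the same over/underfitting patterns:

```python
# Sketch: a learning curve (accuracy vs. number of training samples) for a random forest.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

train_sizes, train_scores, val_scores = learning_curve(
    RandomForestClassifier(random_state=0),
    X, y,
    train_sizes=np.linspace(0.1, 1.0, 5),
    cv=5,
    scoring="accuracy",   # or "neg_log_loss" for a loss-style curve
)

for size, tr, va in zip(train_sizes, train_scores.mean(axis=1), val_scores.mean(axis=1)):
    print(f"{int(size):5d} samples  train acc {tr:.3f}  val acc {va:.3f}")
# A large, persistent gap between the two curves suggests overfitting;
# both curves plateauing at a low score suggests underfitting.
```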
Absolutely brilliant!
Glad it helps!
Short-Sharp-Understandable
what does it mean when test loss is lower than training loss?
It usually means the test set is too easy for the model. You might have too few test samples, or the samples might be both in the training and testing sets, or they might be too simple for the model.
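One of those causes, overlap between the training and test sets, is easy to check directly. A minimal sketch, assuming the data fits in pandas DataFrames (the toy frames below are just for illustration):

```python
# Sketch: check whether any rows appear in both the training and test sets (data leakage).
import pandas as pd

train_df = pd.DataFrame({"x1": [1, 2, 3, 4], "x2": [10, 20, 30, 40]})
test_df = pd.DataFrame({"x1": [3, 5], "x2": [30, 50]})

# An inner merge on all shared columns keeps only the rows present in both sets.
overlap = pd.merge(train_df, test_df, how="inner")
print(f"rows present in both sets: {len(overlap)}")
# Anything above zero means the test loss is measured partly on memorized samples.
```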
Great Insights. Very helpful, Thank you.
Glad it helped!
Like this so much 👍👍👍
Thanks
This is too good❤❤
That beat 💓
holy shit u just explained this so well
Hey! Solid video, just a slight recommendation: balance the audio out a bit; I think L is 10-20% louder than R.
Thanks, CB! Yeah, I’ve been trying to improve the audio for a few videos now. I think the last video came out better? Thanks for the feedback!
Please tell me what the maximum difference between validation loss and training loss should be so that the model is not overfitting. My model is showing a training loss of 1.46 x 10^(-5) and a validation loss of 0.018. So is the model overfitting? Anyone, please reply.
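There is no universal threshold; the usual advice is to look at the shape of the two curves rather than at a single gap. A minimal sketch with made-up loss values (substitute your own training history):

```python
# Sketch: compare training and validation loss curves instead of a single "max difference".
# The loss values below are invented for illustration only.
import matplotlib.pyplot as plt

train_loss = [0.50, 0.20, 0.08, 0.03, 0.01, 0.004, 0.001, 0.0004, 0.0001, 0.00003]
val_loss   = [0.52, 0.25, 0.14, 0.09, 0.06, 0.040, 0.030, 0.0250, 0.0220, 0.02000]

plt.plot(train_loss, label="training loss")
plt.plot(val_loss, label="validation loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()

# Rough reading: what matters is the shape, not a fixed threshold. If validation loss is still
# flat or decreasing, a gap like training 1.46e-5 vs. validation 0.018 may be acceptable; if
# validation loss starts rising while training loss keeps falling, that's overfitting.
```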
great explanation
oh no,... someone must have complained about the sound right?🤨... I'm going to miss the loud videos, I loved them louder. It always woke up my attention😒😒
They did! But now they aren't clipping, which was a problem.
Oriented - great job👍
Amazing explanation, thanks
YAA - You Are Awesome
Hey man, I'm not understanding ROC curves in logistic regression.
So, what's your top suggestion for an ML book?
A-mazing Video
Anyone else like me who has a decent set of speakers is probably greatly annoyed and distracted by the bass thumps you put in the background. I have no idea why you would put that in the audio stream. The video explanation was well done, but I couldn't continue watching for the reason I mentioned.
That’s the type of thing you do when you are learning.
Better 17 minutes than I've seen in years.
Thanks
Indian dominance is so high in IT now that even white guys make educational videos with an Indian accent.
What?
read about islam man