Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
the Confusion matrix is not so confusing anymore
Nice! :)
I see what you did there
@@thud93factory51 Everyone can see what he did there.
😂😂😂😂
The whisperr at first is very addictive.. 😂 statsquest
bam! :)
thank you statquest
You're welcome!
for each in range(10000):
    print(each, "bam!")
This is my favorite program ever! :)
Thanks a ton and love for BAM😀
Awesome! :)
nice songs and concepts too
Thanks!
Hello Josh,
Thank you for the clear Heart Disease example.
I have a question.
In your results, the comparison between two algorithms A and B (Logistic Regression, K-nearest neighbors, ...) always seems to follow the same logic:
If TP of A > TP of B, then TN of A > TN of B => easy to choose the better algorithm (A).
I am confused about whether this pattern is guaranteed or just a coincidence in your example.
Because if TP of A > TP of B but TN of A < TN of B, I would not be confident choosing A. (I think TP and TN are equally valuable in validation.)
Thank you and have a good day!
For more details on how to choose the best algorithm see: ruclips.net/video/4jRBRDbJemM/видео.html
Thanks .... BAMMM !!!!!
I will definitely buy one song :)
Hooray!!! Thank you so much! :)
Understood the concept within 2 mins, something my professor was not able to achieve in a three hour lecture, and this video is free to watch. Thank you for the informative and extremely helpful video!
Awesome! :)
Lol
This is the magic of RUclips. Shorter videos are often more information-dense and have "less noise" -> less text = fewer things you will mistakenly focus on instead of the true banger concept.
Teachers spend 30 seconds explaining what we will learn. Everybody misses this part and is confused for the rest of the course. Here, we get a full 2-5 minute explanation of what we are talking about, then we delve into the subject.
MERICA BABY!!!
This is me watching this 5 years after graduating from a higher institution, and I am understanding it completely for the first time... school was truly annoying; they made even the simplest things feel so complicated and hard to understand.
I keep coming back for the BAAAMMMM !!!
Hooray! :)
False Positive - Type 1 Error - Reject a true null-hypothesis.
False Negative - Type 2 Error - Fail to reject a false null-hypothesis.
"In this case the machine learning algorithm didn't do very well, but can you blame it? These are all terrible movies!!... BAM" I laughed like hell. Love StatQuest.
Thank you! :)
BAM!
HAHAHAH same!!! BAM!!!
If my software venture succeeds, then I hugely owe it to you. Your pedagogical approach to the topic will benefit many individuals and teams like me. Thank you, sir!
Wow, thank you!
Statquest is simply a lifesaver!
You made me start loving maching learning and statistics, thank you!
Hooray! :)
The Intro song gives a great relief for us to get rid of confusion...
:)
Very clearly explained, thank you! As good as always!
Hooray! :)
Had to sign up as a member of StatQuest and support your initiative. Thank you so much for making all this content available. Greetings from Brazil, Josh!
Muito obrigado!!! Thank you very much for your support. It means a lot to me.
I'm now double certified in data science, and singly so in digital forensics. I began with StatQuest, and I keep using it for refreshers if I have a meeting or interview.
Here I am, prepping for possible interview questions that come up a lot.
Good luck with your interview! :)
By double certified you mean dual degree?
@@taheralipatrawala7300 No, I was in a coding boot camp, but they kept dropping the ball, so I was stuck waiting for long periods while they fixed mistakes and put the curriculum on hold. I enrolled in another course, and they both ended up finishing around the same time.
Then I enrolled in something else after, all the while building a growing portfolio full of cool projects people like discussing, which is a big advantage at the interview stage.
Tech is an interesting field in this respect:
experience and education are both good, but both together is great.
What sort of questions come up a lot during interviews for a data science role?
I laughed at that < bam. > I am a big fan of your BAM videos! lool
1:30 To test how each machine learning model performs with our test data, we can use a confusion matrix.
Let’s say we have two outcomes from our classification model, either true or false. Then the rows can be the predicted values from our model (true and false) and the columns can be the actual values in our data (true and false).
Data points can be classified as true positives (both predicted and actual are true), false positives (predicted is true but actual is false), false negatives (predicted is false but actual is true), or true negatives (both predicted and actual are false).
4:30 using confusion matrix to compare random forest and logistic regression.
5:15 confusion matrix for multiple class classification, the diagonal represents number of true positives
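The notes above can be sketched in plain Python (made-up labels for illustration, using the layout described in the notes: rows = predicted, columns = actual):

```python
# Tally a 2x2 confusion matrix by hand from toy labels.
actual    = [True, True, True, False, False, False, True, False]
predicted = [True, False, True, False, True, False, True, False]

tp = sum(p and a for p, a in zip(predicted, actual))              # predicted True,  actual True
fp = sum(p and not a for p, a in zip(predicted, actual))          # predicted True,  actual False
fn = sum((not p) and a for p, a in zip(predicted, actual))        # predicted False, actual True
tn = sum((not p) and (not a) for p, a in zip(predicted, actual))  # predicted False, actual False

matrix = [[tp, fp],   # row: predicted True
          [fn, tn]]   # row: predicted False
print(matrix)  # [[3, 1], [1, 3]]
```

With these toy labels there are 3 true positives, 1 false positive, 1 false negative, and 3 true negatives.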
I really liked how you mixed the sentiments in the bams, depending on the results! Great videos!
Thank you! :)
hey this is my first time watching your video, and thank you for your explanations about confusion matrix i like it.
i'm still student
Glad it was helpful!
Trillion BAM. Wonderful Guru, Fun learning.
Thanks!
your videos are so enjoyable, makes it a joy to learn the stuff...esp the BAM :)
Glad you like them!
Very nice, I'm dealing with the Kuzushiji-49 dataset, and so I have a 49x49 confusion matrix. Helped clear things out a lot!
bam!
I was miserable but you boosted my mood, the intro is on point, BAM!
bam! :)
Came across Statquest after almost giving up on learning ML. Things are starting to make sense and are fun now! Amazing channel with amazing explanations!
Great to hear!
I hope your knowledge and wisdom can help me ace my sem finals and get my degree
Good luck!
Confusion Matrix is really confusing but you clearly explained it! Great job, Josh!! Thank you.
Glad it was helpful!
I realize you should know: half of the time I like your videos, it's not because of the teaching. It's just me loving your songs and how you whisper excitedly "StatQuest!" at the beginning. It's the best!
bam!!! :)
awesome explanation. loved the BAM!!
Glad you liked it!
A Harvard course introduced the confusion matrix to me and it was thoroughly confusing. You explained it within the first 2 minutes of this video.
Awesome! :)
Thank you so much! You explained in 8 minutes what my Data Science Professor took forever to explain. It was so clear and concise.
Glad it was helpful!
감사합니다!! Thank you... this video is really helpful for my categorical data analysis class. From South Korea!
Thank you! If you like this video, check out the korean translation of my book: www.yes24.com/Product/Goods/117173369
Thanks! I speak Spanish and I'm learning English. Your videos are amazing, and I'll try to make statistics videos in Spanish. You are an inspiration to me, thanks so much.
bam! :)
I needed to understand confusion matrix as soon as possible. I learned in less than 10 minutes... great!
Bam! :)
+1 for Troll 2 from the residents of Nilbog!
BAM! :)
Like you have "Double BAM", you also need to add "Double Like" to your videos for the times we re-watch them!
That would be awesome! :)
Thanks for helping me get out of this problem, as today is my Artificial Intelligence exam. Love from India.
Good luck! :)
5:45 I think we can blame the ML algorithm, because it is significantly worse than random guessing! 😁
:)
BAM! that cleared the confusion. Double BAM!
YES! :)
Awesome and Easy Explanation !!!
Thank you!
Note! In sklearn, confusion_matrix is the other way round: actuals are on the left and predicted on top. Just in case your model performs amazingly... for some reason :))
True!
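A quick way to check scikit-learn's orientation for yourself (a minimal sketch with deliberately lopsided toy labels, requires scikit-learn):

```python
from sklearn.metrics import confusion_matrix

# Toy data: the model predicts 0 (False) for everything,
# so all the mistakes are false negatives.
actual    = [1, 1, 1, 0]
predicted = [0, 0, 0, 0]

# scikit-learn puts actuals on the rows and predictions on the columns:
# [[TN, FP],
#  [FN, TP]]
cm = confusion_matrix(actual, predicted)
print(cm)
```

The three false negatives land in the lower-left cell (actual = 1 row, predicted = 0 column), confirming the rows-are-actual layout.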
Oh my goodness, did someone else watch Cool as Ice?....I feel sorry for you, I wake up in cold sweats over that movie!
btw; I have watched 3 of those movies...I don't know what I'm doing with my life
Awesome! :)
Superb explanation, understood it very well. Thanks a lot, StatQuest.
Glad it was helpful!
0:12 .. naughty americaaaa..
Confusion Matrix used to be confusing ;)
Awesome! :)
Cool as Ice takes place in the same universe as the live action TMNT movies from the 90s.
Ha! :)
Woah, this whole confusion matrix finally made sense in my head, only took 6mins! I watch at 1.25x speed ;) gotta learn fast! Thanks a lot Josh, subscribed!
Hooray! I'm glad the video was helpful. :)
It would be interesting to identify the false positives and negatives that the random forest and k-means algorithms had in common, if any. If there were many, that would indicate that the metrics used in the analysis needed tweaking.
Good insight! :)
Rows should be actual and columns should be predicted, otherwise it will be confusing later.
That depends. Sometimes rows are "actual" and columns are "predicted", sometimes (like in this video, in the wikipedia article on confusion matrices and in the R programming language) it is the other way around.
revise the concepts ! hooray ! tiny bam , big bam!! love you joshu
:)
Holly crab , this video made everything clear . Thank you.
Glad it helped!
That's the most academic BAM! I've ever heard.
Noted! :)
Can you blame em, these are all terrible movies! 😂😂
:)
Wow, what a nice channel, I wish I had found you sooner. Thank you very much.
Thanks!
Sir, I am from India. I was enrolled in a Data Science programme which cost me 45k. The Python and SQL courses were nice, though I still had to put a lot of effort into mastering them, but the ML classes were not good at all. My ML class is going on right now as I write this comment, but I am not attending it; I am watching your video instead. I don't want to waste my time there just to satisfy myself that I have paid a huge amount of money; I would rather spend my time on a video where I can really understand something.
Lastly, I want to thank you for providing such awesome content. Currently I am not working, so I have some limitations on spending money, but when I get a job in data science and ML I will surely contribute to this channel. "Thank You Again....."
Love from INDIA. "Jai Shree RAM"
Thank you and good luck with your course! BAM! :)
Day 1.
of watching all the videos in the Machine Learning Playlist
bam! :)
I understood this concept BAAAAAMM
BAM! :)
binge-watching! let's see where I go.
BAM! :)
Yet another Amazing Video ….. 👍🏻👍🏻👍🏻
Thank you so much 😀
Gotta smash that like and subscribe button, this channel is golden.
Thank you!
Very good explanation. Thanks.
Thank you!
'unconfusing' the confusion matrix, BAM!
bam! :)
My confusion about the confusion matrix is a False Negative now, or something like that
Great 👍🏻🎉 Thanks for your support and Video
Thanks!
I think the term "confusion matrix" just sounds cool!
Totes!
you made me so clear with Confusion Matrix. Great job, BAM .
Glad it was helpful!
Bam!!! I fully understand now. Thank you!
Bam! :)
No Triple BAM in this StatQuest video
Good observation! :)
StatQuest with Josh Starmer Not as good as your explanation of the Gradient Descent and Confusion Matrix. You saved my life!
Hooray! Thank you! :)
"can you blame them? These are all terrible movies"
"bam"
Ha! :)
Thank you for explaining the Confusion Matrix! This is super helpful. I keep coming back for more of your videos! THANK YOU.
Glad it was helpful!
I come back for the BAM.
Guy just simplified "THEEE CONNFFUSION MAATTRIXX" (reverberating sound)
Thanks Josh :)
bam! :)
thanks! this was very informative and fun and easy as well
Glad it was helpful!
WOW .... Beautiful Explanation in a simple and Understandable way with diagrams... Thanks a lot. Helped me a lot to understand in a crystal clear way.
bam! :)
Mind Blowing Explanation !! Even a primary school kid could understand these concepts. Take a bow, Josh !!!
Thanks!
The size of the confusion matrix is determined by the number of things we want to predict.
yep
@@statquest oh my god. I was just using the comments as a way to make notes. Didn't think that I would get a reply from you. Loved your content btw :)
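To see how the matrix grows with the number of classes, here is a minimal sketch with three hypothetical labels (rows = actual, columns = predicted), giving a 3x3 matrix whose diagonal counts the correct predictions:

```python
# For a multi-class problem, the confusion matrix is n_classes x n_classes.
actual    = ["cat", "dog", "bird", "cat", "dog", "bird"]
predicted = ["cat", "dog", "cat",  "cat", "bird", "bird"]

labels = sorted(set(actual))  # ["bird", "cat", "dog"]
index = {label: i for i, label in enumerate(labels)}

n = len(labels)
cm = [[0] * n for _ in range(n)]
for a, p in zip(actual, predicted):
    cm[index[a]][index[p]] += 1  # rows = actual, columns = predicted

correct = sum(cm[i][i] for i in range(n))  # diagonal = correct predictions
print(cm, correct)
```

With these toy labels, 4 of the 6 predictions sit on the diagonal.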
Going to finish the 96 videos in two weeks... Bam!
Wow! BAM! :)
Me: learning ML and trying decision trees. YOU: explaining exactly what I'm doing step by step in 1 video.
bam! :)
Your content is so beautiful, it makes me want to cry
Thank you!
Really very good explanation , Thank you :)
Glad it was helpful!
I didn't sweat it at all! I just moved to youtube and searched: Statquest confusion matrix 😀😀😀
TRIPLE BAM! :)
I just want to say thank you so much man, i've been sucking ass in my machine learning class, and you've helped me out so much, I love how you explain things.
Thank you! Good luck with your class.
Hey, nice job. I am wondering which software you are using to make the slides and videos. Is that PowerPoint or something else? I like the way you present.
Thanks! The early StatQuests were done in PowerPoint, but then that started crashing on me in the middle of presentations so I moved to Keynote.
3 in 1
always useful to pass by here
Thank you, Mr. Starmer, for the KEY, the BAM, and the nice StatQuest.
Wish you the best SIR
Thank you so much!!! :)
This is the best video on confusion matrix I have watched. Once again, analogies and storytelling to the rescue
Thanks!
BAM!!! Thank you! Original and clear explanation!
Hooray! You're welcome. :)
I. Love. Stat. Quest
Hooray! :)
My terrible joke apart, your explanations are wonderful. I can't thank you enough!
Done once, forever and for everyone. Who knows, your name might survive the corroding waves of history. A little something often takes unexpected proportions, even long after the author is gone.
Thanks! :)
Great Video , Thank you . Really helped .
You're welcome!
In most cases, the x-axis stands for ground truth and the y-axis stands for predicted values.
Python does it one way, R, and the wikipedia article on confusion matrices, does it another. en.wikipedia.org/wiki/Confusion_matrix
So it's good to be able to adjust to either configuration.
@@statquest I agree.
My confusion matrix size was (50,50) until I saw your video, then it became (2,2). Lol
:)
Great video! Thanks.
Glad you liked it!
I have no idea how such beautiful content has so few views and subscribers. Josh, you da best my man ❤
Thanks!
need an extended version of the intro song , i loved it ! :D
Ha! Thanks! :)
Thank you so much. I was so messed up with this topic; you explained it to me very easily.
Happy to help
@@statquest :))))
I rewatched the bootcamp recording which explained confusion matrix for 20 minutes and was still left confused until statquest came in an explained the concept in less than 7 minutes. BAM !!!
BAM!!! :)
Thank you for another great video. I have a question: What about continuous data?
For continuous data we often use the "sum of the squared error" or the "mean squared error". If you measure the distances between the observed and predicted values, square them and add them up, you get the "sum of the squared error". If you then divide that sum by the number of observations, you get the average, or mean, squared error.
@@statquest Thank you very much.
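The calculation described above, as a quick sketch with made-up observed and predicted values:

```python
# Sum of squared error (SSE) and mean squared error (MSE)
# for toy observed vs. predicted values.
observed  = [3.0, 5.0, 2.5, 7.0]
predicted = [2.5, 5.0, 4.0, 8.0]

# Square each residual (observed - predicted) and add them up...
sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
# ...then divide by the number of observations to get the mean.
mse = sse / len(observed)
print(sse, mse)  # 3.5 0.875
```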
Just for that intro song I'm subscribing
Bam! :)
Why didn't I stumble upon your videos earlier? Quality content.
Thank you!