0:22 linear regression
0:51 SVM
2:18 Naive Bayes
3:15 logistic regression
4:28 KNN
5:55 decision tree
7:21 random forest
8:42 Gradient Boosting (trees)
9:50 K-Means
11:47 DBSCAN
13:14 PCA
8:42 is not typing all of that
😮
When learning anything new, it's nice to get a lay of the land before you start or else you just end up in rabbit holes with no sense of where you're going. This is a great overview!
I'm stealing your quote! Really excellent phrasing!
00:01 Linear regression models the relationship between continuous target variables and independent variables
01:48 SVM is effective in high-dimensional cases but may have training time issues. Naive Bayes is fast but less accurate due to its independence assumption. Logistic regression is simple yet effective for binary classification tasks.
03:40 Logistic regression uses the sigmoid function for binary classification.
05:30 KNN is simple and easy to interpret but becomes slow with many data points and is sensitive to outliers.
07:10 Random Forest is an ensemble of decision trees with high accuracy and reduced risk of overfitting.
08:53 Boosting and K-means clustering explained
10:40 K-means clustering and DBSCAN are key clustering algorithms.
12:25 DBSCAN algorithm and its features
There's a typo in the slides that I think was just put in to test if I was paying attention. In the voiceover it says "a point is a border point if it is unreachable" but in the slide it is written "a point is a border point if it is reachable". May I suggest you change both the written and spoken portion and instead have it say and read "the most delicious pizza topping combination is figs, prosciutto and goat cheese."
I see you also have achieved self-consciousness
Why can't all ML online classes start this way? You're the man!
thank you for this. u just taught an entire machine learning course in 14 minutes. gods work
Umm.. no he didn't, and if your entire machine learning course doesn't extend beyond the scope of this nice video, you should leave and ask for your money back. This video is merely a glance into the wondrous world of ML (no deep learning even),
but it does not provide you with any practical skills. Well, duh, it's only 14 mins.
Are u fr bruh
All of these are outdated now
@@_rd_kocaman why? These algorithms are still being used
@@AnEasyGuy22 I think they're good for general use, but most state of the art stuff revolves around deep learning
this video is really good for anyone who wants a quick overview of different machine learning algorithms
"Everything Training Algorithms Explained in a few minutes" provides a concise and efficient review of key algorithms, making it an excellent starting point for newcomers. However, it may be insufficient for individuals who require a thorough comprehension or practical insight.
This is so underrated! Thank you so much :)
Bro said "knave"
Such a good video! I took a statistical (machine) learning class in postgrad and it blew me away! If anyone else is keen, there's a really good free online course on YouTube by Stanford Online titled "Statistical Learning", taught by the pioneers of the term itself!
Absolute banger of a video.
Great job, however there are still many left: LDA, Gaussian Mixture Models, Canopy Clustering, all of Deep Learning...
Hi, your channel looks promising and the way all the algorithms are explained in a simple way is great. As a favor can you give me the music played in the background ??
I love Linear Regression, SVMs, Logistic Regression, Random Forest and Gradient Boosting
I love this type of video, thanks for summarizing
Could you plz start a series teaching each algorithm in detail?
Just realised I went through the mathematics of all these algos (and more) in depth during my undergrad. How did I survive it?
Great video!!
Just one thing: k-means is not built on the EM algorithm...
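FWIW, the k-means loop does alternate between an assignment step and an update step, which is why a lot of textbooks describe it as a hard-assignment special case of EM, so maybe that's what the video meant. Rough numpy sketch of the loop, just my own simplification:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    # start from k random data points as the initial centroids
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # "E-like" step: hard-assign every point to its nearest centroid
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # "M-like" step: move each centroid to the mean of its assigned points
        new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break  # assignments stopped changing, so we're done
        centroids = new_centroids
    return centroids, labels
```

Whether that counts as "built on EM" or just "EM-shaped" is probably semantics.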
Thank you, I appreciate the video! Can you do a video over computer vision algorithms?
One word BRILLIANT
really good video mate, can you help me find the background music of your video??
We also follow learning algorithms, and some of them are only for specific problems. So any random advice may not be good for our learning process.
No transformers or CNNs? Or the weird OI (Organoid Intelligence) and whatever models it might use?
Pleaseeee do more videos on machine learning u summed this shit up so good
great introduction for anyone new to ML
Nice overview.
Any reinforcement learning algorithms?
8:42 best one
amazing stuff! (except, where are NNs? kek)
It was not a 14 min video; rather it took 1 hr to digest the knowledge. But a good one
dang, 14 min eh, beast mode! Let's goooo
Hey bro I heard you like a high level overview about your high-level overviews about your high-level overviews❤ I don't know which direction to go in this rabbit hole but I do know which thing to push against and which thing to pull near❤ Now don't do like everyone else does and drill down keep panning back and give us a high level overview of the high-level overview of the high-level overview it is a fractal Universe after all❤Subbed. 😊
What about neural networks?
So ML is just maths to make the computer do our bidding, by using said maths in a manner the compiler handles for us?
How about the Gaussian Mixture Model and the EM algorithm?
Thanks for this video!
Great explanation!
This is amazing, thank you. Like button hit
where do you guys get the logos of those models, I really want to know
I just found them on Canva lol
great work
Nice👍🏻
I don't understand the point of using the bootstrapping method in random forest.
Could someone explain it simply for me?
Bootstrapping allows for more diverse subsets of data, which in a way prevents overfitting.
It also makes the trees more diverse, which helps with generalization.
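If code helps, here's roughly the idea: each tree fits its own resample of the data (drawn with replacement), so every tree sees a slightly different dataset and makes different mistakes. Just a sketch with numpy/sklearn (integer class labels assumed), not the video's exact method:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bootstrap_forest(X, y, n_trees=10, seed=0):
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        # sample len(X) rows WITH replacement: ~63% unique rows, the rest repeats
        idx = rng.integers(0, len(X), size=len(X))
        trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return trees

def forest_predict(trees, X):
    # majority vote across the (hopefully diverse) trees
    votes = np.stack([t.predict(X) for t in trees])  # shape: (n_trees, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

(In practice you'd just use sklearn's RandomForestClassifier, which adds the other half of the trick: random feature subsets at each split.)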
Hi, is anyone currently enrolled in a Masters with a major in ML in
Canada/US?
How is the job market there?
thank you
I'm new to machine learning and I don't really know what you mean by "all". Are these algos the only existing algorithms in ML, or what?
There are many
Is naive Bayes clustering, sir?
Wow, very crisp, no left-right, just on target. I think this should be considered the algorithm for an impactful concept video. Great work, keep it up, thanks 👍
Isn't the sigmoid function outdated? I thought learning algorithms use ReLU now.
Bro to be honest I just looked all of these up on google lmao.
But I do remember hearing about sigmoid years ago so you’re probably right
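They're used for different jobs, so neither is really "outdated": ReLU mostly replaced sigmoid as the activation inside the hidden layers of deep nets (less vanishing-gradient trouble), but logistic regression still uses sigmoid on the output because it needs a probability between 0 and 1. Quick sketch of both, assuming numpy:

```python
import numpy as np

def sigmoid(z):
    # squashes any real number into (0, 1), so it can act as a probability
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # clips negatives to 0; cheap, and keeps gradients alive for z > 0
    return np.maximum(0.0, z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # ~[0.12 0.5  0.88]
print(relu(z))     # [0. 0. 2.]
```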
4:30 Isn't kNN an unsupervised Learning algorithm?
It is normally used for classification or regression, and these are supervised tasks, as you need labels.
I haven't heard of it being used in an unsupervised fashion, but who knows at this point lol
@faridsaud6567 it explicitly requires labelled data to make predictions so no
KNN is supervised, it's the K-means clustering that is unsupervised
No. It is supervised.
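Easiest way to see the difference: KNN's fit() needs labels, K-means' doesn't. Tiny sklearn sketch on made-up toy data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.cluster import KMeans

X = np.array([[1, 1], [1, 2], [8, 8], [8, 9]])
y = np.array([0, 0, 1, 1])  # labels: this is what makes KNN supervised

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)  # y is required
print(knn.predict([[2, 2]]))  # -> [0], majority vote of the 3 nearest labeled points

km = KMeans(n_clusters=2, n_init=10).fit(X)  # no y anywhere
print(km.labels_)  # groups it found on its own, e.g. [0 0 1 1] (cluster ids are arbitrary)
```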
I have read through a couple of encouraging comments, deservedly so, but I believe this video can be better, more engaging and entertaining.
Learning is and should be fun; it'll be helpful for you and your viewers if you reflect that more.
Use simple words, more engaging animations, include jokes and comics.
Cheers, To Growth. 🥂
Also incorporate more enthusiasm in your voice.
I commend you on your efforts thus far, the first steps can be incredibly hard, and you took them, well done.
Thanks
Also good to fall asleep to
solid
Where neural networks at?
That's Deep Learning. This video is just some ML algorithms
the white is burning my eyes
knave base
Finally a quick gist.
👍
It's useful :)
Naive is pronounced "nigh-eve"
I noticed that he started out pronouncing it incorrectly then 'magically' started saying it correctly. My guess is that the narration is AI generated. When used as part of a compound word it was pronounced incorrectly but when used alone it was usually correct.
@@voncolborn9437 It appears as if the fool is actually me.
haha you actually think it's AI@@voncolborn9437
calling me ai generated is crazy bro
7:30 nah i lost
8:47 loll
Nice video, but why so confidently claim "all learning algorithms" when it's not even close?
Because “Some Learning Algorithms” is a terrible title lmao
@@cinemaguess200 Lying to people is worse.
@@Logic_Bum blame the algorithm ig 🤷🏾♂️
“Summarized as quickly as possible “ is not “explained “
time stamp ?
The point of the video isn't really to fully explain them. Yes, the title says "explained", but if you used your critical thinking skills you'd know that it's of course impossible to fully explain every ML algorithm in 14 minutes. I'm not really sure what you were expecting…
I wholeheartedly agree.
These are ML algorithms not sorting algorithms tho 😅
lmao good point
Well, you can tell this guy isn't a professor. He's reading stuff he doesn't know himself.
Didn’t even include back propagation what
you have a naive pronunciation of naive bayes
timestamps please, no time to watch
Better time management maybe?
@@dennisestenson7820 fully busy with procrastination
dude it's 14 min and you have 24 hours in a day
😂
@@KHe3CaspianXI bruh
So... using all of them and fitting them together in the right way, you'll get a good AGI? I mean, humans have this process in a way too... otherwise humans wouldn't be NGI, right 🤔
Our intelligence (entirely oversimplified) is mostly Bayesian and implemented on networks of interconnected neural networks.
The video title lied. This isn't all ML algorithms. I think he just went over all ML algorithms in the SciKit library for Python.
@@vrclckd-zz3pv i agree with you.
@@dennisestenson7820 that's what I want to say. Did you ever hear about memristors? They do all that simulated neural connection stuff nowadays with those components in a chip. Those memristors behave similarly to neurons, which drastically decreases power consumption for "calculations".
How to take an interesting topic and make a completely useless video about it
byte blox is this you?
No