'Clearly Explained' - and it actually was. Thanks man
:D :D
This was a very clear explanation indeed. Thank you!
You're very welcome!
Dude.. I lost count of the videos I watched to understand this, but finally, after seeing your video, the struggle ended. Thank you so much!
One of the best explanations I've ever seen!
Thanks mate! Keep supporting...
How did he manage to explain something that a 1-hr lecture couldn't! Thanks mate
Maybe I'm wrong, but I think the hypothesis is not that X1 and X2 are independent but that X1 and X2 are conditionally independent. It was very clear otherwise, thank you!
In naive Bayes every feature is treated as an independent feature; that's why it's called naive.
I think the hypothesis is that you assume each feature to be (w.r.t. other features)
1) globally independent (in the global sample space)
2) conditionally independent w.r.t. the occurrence of each class label (under the subset sample space where the particular class event has occurred)
If these assumptions are not met, then it does not seem possible to build the mathematics, because as far as I can see,
if events A and B are independent, that does not naturally imply conditional independence between events (A|C) and (B|C)
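A tiny numeric check of this point (a made-up XOR construction, not anything from the video): take A and B as independent fair coins and let the class C = A XOR B. Then A and B are marginally independent, yet they are fully determined by each other once C is known, so conditional independence given C fails.

```python
from itertools import product

# Toy check: marginal independence of A and B does NOT imply
# conditional independence given a class C. Here A, B are
# independent fair coins and C = A XOR B.
joint = {(a, b, a ^ b): 0.25 for a, b in product([0, 1], repeat=2)}

def p(pred):
    """Probability of the event described by pred(a, b, c)."""
    return sum(pr for outcome, pr in joint.items() if pred(*outcome))

# Marginal independence: P(A=1, B=1) == P(A=1) * P(B=1)
lhs = p(lambda a, b, c: a == 1 and b == 1)
rhs = p(lambda a, b, c: a == 1) * p(lambda a, b, c: b == 1)
print(lhs == rhs)  # True -> A and B are independent

# Conditional independence given C=0:
# P(A=1, B=1 | C=0) vs P(A=1 | C=0) * P(B=1 | C=0)
pc0 = p(lambda a, b, c: c == 0)
lhs_c = p(lambda a, b, c: a == 1 and b == 1 and c == 0) / pc0
rhs_c = (p(lambda a, b, c: a == 1 and c == 0) / pc0) \
      * (p(lambda a, b, c: b == 1 and c == 0) / pc0)
print(lhs_c == rhs_c)  # False -> not conditionally independent given C
```

So naive Bayes really does need the conditional independence assumption as its own hypothesis; it does not follow from plain independence.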
LOVED IT!!!
Awesome Explanation! Can't thank you enough...
In the last part of the video you said we can fit a known distribution to a continuous set of data. However, you then wrote that the probabilities can be calculated by taking the product of the pdf evaluated at different values of the feature and label. The pdf does not provide probabilities, however; it needs to be integrated to give the probability of an event. This part of the video seems imprecise.
However, the video in general was great. Thanks.
10:51 How does this work? Wouldn't the probability that Xi = xi be zero, given we're using a continuous distribution? Because of the "=" sign.
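One way to see why pdf values still work for classification (a sketch with hypothetical per-class Gaussians, not the video's exact numbers): the probability of a tiny interval around x is approximately pdf(x) * dx, and the same dx appears for every class, so it cancels when comparing classes. Ranking classes by density is therefore equivalent to ranking them by interval probability.

```python
import math

def gauss_pdf(x, mu, sigma):
    """Gaussian probability density at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) \
        / (sigma * math.sqrt(2 * math.pi))

x = 0.5
# Hypothetical per-class Gaussians fitted from data
d0 = gauss_pdf(x, mu=0.0, sigma=1.0)   # density under class 0
d1 = gauss_pdf(x, mu=2.0, sigma=1.0)   # density under class 1

dx = 1e-6  # width of a small interval around x
# The interval probabilities share the same dx factor...
p0, p1 = d0 * dx, d1 * dx
# ...so the comparison is unchanged whether we use densities or probabilities.
print((d0 > d1) == (p0 > p1))  # True
```

So strictly P(Xi = xi) is zero for a continuous feature, but the argmax over classes is unaffected by swapping probabilities for densities.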
Very clearly explained, thank you!
Sir, please make more lectures.
I'm seeing your lectures two days later.
Please make some advanced NLP and CV lectures, or AI lectures. Thanks!
I will try my best to upload more frequently.
so goood
Sir, is it like this:
The Bayesian classifier deals with conditional probability?
The naïve Bayes classifier deals with joint probability?
Thanks in advance.....
Yeah!
You areee amazing. I love your Indian Bengali accent (just a guess hehe, make me a voice analyzer if I am right XD)
You did good my friend. I'm glad I came across this video
Nice video! Thank you.
Ok, I've given up on the video after 45 secs. You said "stated clearly", if you hadn't I'd have kept watching.
You point to an array of features called X. What are they? Are they features of the array itself (its size / rank / dimension?), are they features of the thing the array is describing (measurements in a house?), or a list of possible attributes (the ingredients on a pizza?) Then you introduce a label. So what, is this like a python dictionary?
Plus, I've no idea what sort of issue we're supposed to be tackling? Is it probability? Is it rationality with limited knowledge? I only guess that because I've heard of Bayes before.
Instead you launch into calculations when I have not the first idea what you're calculating. Why would I listen to that?
Tell you what, I'll give it another 30 secs. If there's no illustrative example / clear explanation of what the hell you're covering I'm gone.
Nope, 30 secs later and it's absolute horseshit.
Great explanation :)
I just love this explanation
Whenever I search for any ML algorithm, I first check your channel to see if you have made a video on it... You are my first preference for ML/DL algorithm explanations. Just a request: please make videos on deep learning algorithms too, like CNN, RNN & LSTM "from scratch". It will really help people who want to become practitioners in AI, like me.
Thank you so much ❤
Writing CNNs and RNNs from scratch is pretty hectic... maybe some day I'll try.
@NormalizedNerd Waiting... you are our only hope who can teach us the mathematics of ML with cool animations. That's why I requested you! Thanks.
It was like a revision for class 12 probability 😁😁
Yeah simple yet effective concept.
Good work with the visuals!!
9:37 you drew the conclusion based on P(X=[0,2] | Y); I think the correct way is to calculate P(Y|X=[0,2]). In case P(Y=1) is very small, the answer can be Y=0.
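This point can be illustrated with toy numbers (hypothetical, not from the video): the likelihood P(X | Y) alone can favor Y=1, yet a small prior P(Y=1) flips the posterior argmax to Y=0, since P(Y | X) ∝ P(X | Y) · P(Y).

```python
# Hypothetical likelihoods and priors for a binary label Y
lik = {0: 0.02, 1: 0.10}    # P(X=[0,2] | Y=y): likelihood favors Y=1
prior = {0: 0.95, 1: 0.05}  # P(Y=y): Y=1 is rare

# Unnormalized posteriors: P(Y=y | X) is proportional to P(X | Y=y) * P(Y=y)
post = {y: lik[y] * prior[y] for y in (0, 1)}

print(max(lik, key=lik.get))    # 1 -> likelihood alone picks Y=1
print(max(post, key=post.get))  # 0 -> posterior picks Y=0
```

The evidence term P(X) is the same for both classes, so it can be dropped from the comparison, but the prior cannot.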
Well explained, a quick revision for naive Bayes. I forgot why it was called naive until I watched this video 😂😂
Thanks! Haha.
Yes, this was actually well explained. Thank you :)
just quit confusing people
Thank you so much man!!
very helpful🥺🥺
Hello people from the future! :D
Great video man great
here is a sub
The explanation is so cool! But it would be even cooler if you added some examples with continuous features and fitting a distribution; this part wasn't so clear...
And how about Gaussian NB?
I appreciate your work.
Thanks a lot!
Love this
10:37
Thank u it was great.
HUGE thanks for perfectly delivering the whole concept in one video bro!!
Well done👊👊
you saved me
liked that
Thank you!
Nice ⭐⭐⭐⭐⭐
HELPFUL!!!!
Thank you very much for the video. Clearly explained indeed, the only part I couldn't get completely was the discretization.
In the last part, at minute 11: what is the function f used to fit a known distribution? Thank you for answering!
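One plausible reading of "fitting a known distribution" (assuming a Gaussian, as in Gaussian naive Bayes; the feature values below are made up): estimate the mean and variance of the feature within each class, and f is then the Gaussian pdf with those fitted parameters.

```python
import math

# Hypothetical values of one continuous feature for one class
feature_vals = [4.9, 5.1, 5.0, 5.3, 4.8]

# "Fitting" the Gaussian = estimating its parameters from the data
mu = sum(feature_vals) / len(feature_vals)
var = sum((v - mu) ** 2 for v in feature_vals) / len(feature_vals)
sigma = math.sqrt(var)

def f(x):
    """Fitted Gaussian pdf for this feature within this class."""
    return math.exp(-((x - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

print(round(mu, 2))               # 5.02: the fitted mean
print(f(mu) > f(mu + 3 * sigma))  # True: density peaks at the fitted mean
```

Each (feature, class) pair gets its own fitted f, and those density values are what get multiplied together at prediction time.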
hey, thanks man, very clear explanation.😀😀
3b1b's bro is here
Haha :3
Amazing video. thanks.
very nice explanation thank you so much
Thank you very much for your work! Nice explanation!
You are welcome!
Very nice explanation and perfect illustrations!!
Really good work, congrats
Thanks man!
I love this explanation 😍🥰😘
Thanks a lot ❤
thumbs up
thanks
I wish you were my professor
Awesome! Thank you.
independant moment
Great explanation
great explanation
love this!
well done
That was great! I'm really glad that I found your channel. Thanks a lot 👍👍
It was clearly explained, as mentioned in the title. Thanks a bunch!!!
Nicely explained!
Very clear explanation!
This is really well explained.
bro, best explanation I could find
Thanks bro :)
Great explanation.
Glad it was helpful!
truly amazing
Thanks!
Amazing teaching skills