He is an amazing teacher. When I have trouble understanding a concept, I listen to his explanation of it.
Who are these people who disliked this video? Andrew Ng is an amazing teacher, and anything he says is automatically absorbed by my brain and understood for good. I am a fan of his teaching.
There will always be people who want to provoke or are angry for some totally unrelated reason (I have one such colleague; truth is not important to him). Most probably it has nothing to do with the contents of this video.
dislike!!
yeah!!!
I think it is people who know that their jobs will be replaced by artificial intelligence :)
I wish Andrew Ng would make some more videos and MOOCs on new material. He's the best teacher!
Thank you Prof. Ng, this video helped me a lot.
Andrew is the best teacher for ML. Thanks for posting this.
Finally I got what the Naive Bayes algorithm does, thank you so much :)
Very clear, concise, and simple explanation.
best ML prof I ever met
Awesome teacher, such a great and insightful explanation!
Waiting for another Coursera course by Sir Andrew on Machine Learning...
The title is misleading, naive Bayes isn't included in the video.
Are you sure it is legal to share this video on YouTube??
Does anybody know where this video is from? It looks similar to his Machine Learning course on Coursera, but it isn't there.
These videos predate the Coursera course, so you cannot find them there. I have provided the link below the video.
Andrew goes on to say that we fit a logistic regression model using gradient descent (at 1:35). Isn't this done using maximum likelihood?
Yes, but in general there isn't a closed form solution for the coefficients of a logistic regression model (like there is in linear regression). In practice, we find the MLE using one of several numerical methods such as iteratively reweighted least squares. It's not wrong to use gradient descent, although it may not be the most efficient.
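Concretely, here is a minimal sketch (not from the lecture, just for illustration) of what "fitting by gradient descent" means here: we minimize the average negative log-likelihood of the logistic model, which is the same as finding the MLE numerically. The toy data, learning rate, and iteration count are placeholder choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_gd(X, y, lr=0.1, n_iters=5000):
    """Fit logistic regression weights (with a bias term) by gradient descent
    on the average negative log-likelihood, i.e. compute the MLE numerically."""
    X = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend a column of 1s for the bias
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        p = sigmoid(X @ w)              # predicted P(y = 1 | x)
        grad = X.T @ (p - y) / len(y)   # gradient of the average negative log-likelihood
        w -= lr * grad                  # gradient descent step
    return w

# Toy example: one feature, labels roughly separated around x = 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(float)
print(fit_logistic_gd(X, y))  # [bias, slope]
```

Methods like iteratively reweighted least squares typically converge in far fewer iterations, but the plain gradient-descent loop above is the version Andrew describes in the lecture.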
Oh, thanks for the clarification.
Thanks for sharing these!
How do we get P(x|y)? Thanks.
The voice could be louder and clearer. The explanation is good.
perfect