Finally, someone who teaches the mathematics behind it and not just an introduction.
Hehe… thanks!
I watched a lot of videos and read a lot of articles, and none explained the concept as well as you. Thanks. And keep it up.
Hi, it means a lot to me... thank you so much. Glad I could help :)
A very good job... Can't thank you enough... Instead of going slow with one video per week, you could go faster and complete the whole machine learning series.... You are the only person on RUclips who is very precise with the fundamentals.... There are others who are into machine learning, but they just focus on how to call the model and get your work done, which is not the case with you...
Thank you so much. You gave me very valuable feedback. Your comment made me very happy. I will do my best to upload videos faster, but I am usually busy with my other daily studies and activities, so I don't get enough time. But if I am free for a few days, then I can upload more videos.
I had been struggling until I found your video. Thank you so much.
Thanks for mathematically explaining Logistic Regression. It finally makes sense.
If you found value in this video, then hit the like button 👍, and don't forget to subscribe ⭐. I always love your support 🤗
Amazing video, please don't stop posting, you are helping so many people out! You have no idea!!! Thank you!
Thank you so much for this compliment. I will keep uploading more videos. I have already planned for my next uploads. Hope you find my other videos helpful as well.
Fantastic, very didactic: clear, good graphs, a good pace of writing, all perfect.
Thanks! Very glad to hear this 😄
Thanks a lot for this introductory lecture 🙂
Lecture 1 of the Logistic Regression Machine Learning series completed ✅
Awesome explanation!! Many people explain logistic regression in a very complex way. But you explained it in an easy and structured manner, which made it simple to understand and very helpful for summarizing. Thanks for this video. Please keep posting such amazing videos.😄
Glad I could help 😇
Thank you for clearing my doubts on logistic regression... I was struggling for a long time 🙏🏻☺️
Happy to help. Glad it was useful for you! 🙂
Great explanation, bro... Your video is being played in Australia in a top college lecture. The lecturer gave references to your videos.
Wow… it's an honour… thank you so much 😊
Excellent! I am doing ML and I actually love math, and you give a very good explanation of this. Thank you so much!
Glad you found it valuable 😇
much love for the mathematics in this video! appreciate it a lot from a stat major
Thank you sooo much for making this simple to understand
You're very welcome!
Thanks a ton. You explain really well.
You’re welcome
you are doing well, keep going, bro!
Thank You !
You are really professional, I like you, my brother!!
3:08 I understand that the sigmoid curve gives a more accurate description of the data being 0 or 1 (or, later in prediction, the probability of it being 0 or 1). But as we have a binary problem, the only important part of the curve is the point where 0.5 is crossed. So the linear curve does exactly the same job as the sigmoid curve when it is fitted so that it crosses at the same point. Is the sigmoid curve in this simple example still better for some reason I don't get?
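In case it helps anyone with the same doubt: on clean data a fitted line can indeed cross 0.5 at the right spot, but a single extreme example drags the whole line, and with it the crossing point, while the sigmoid output stays squashed between 0 and 1. A minimal sketch of that effect (toy, made-up data, assuming NumPy; not code from the video):

import numpy as np

# Toy 1-D binary data (hypothetical): feature value -> class 0 or 1
x = np.array([1.0, 2.0, 3.0, 4.0, 6.0, 7.0, 8.0, 9.0])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

def linear_crossing(x, y):
    # Least-squares line y = m*x + c, then the point where it crosses 0.5
    m, c = np.polyfit(x, y, 1)
    return (0.5 - c) / m

print("0.5 crossing without outlier:", linear_crossing(x, y))

# Add one extreme positive example far to the right
x_out = np.append(x, 50.0)
y_out = np.append(y, 1)
print("0.5 crossing with outlier:   ", linear_crossing(x_out, y_out))

# The line's decision point shifts because of one point, so some of the original
# positives now fall on the wrong side; sigmoid(w*x + b) is bounded in (0, 1)
# and is far less sensitive to such points.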
This was very helpful thank you
Glad it helped you!
Cannot thank you enough for such a clear explanation. Subscribed and liked.
I am glad that I could help.
Very solid explanation, dude! Thanks a lot! I combine your videos with Coursera tutorials as well :) Since I am not good at math, I have a question: what does b represent in the formula w^T x + b? Thanks a lot!
Thank you so much! b is a parameter known as the bias. It can take any single scalar numeric value.
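To make that concrete, here is a tiny sketch of the formula in question, sigmoid(w^T x + b), with made-up numbers (assuming NumPy); b is just a single scalar added to the weighted sum before the sigmoid squashes it:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([2.0, 1.5])    # one example with two features (hypothetical values)
w = np.array([0.8, -0.4])   # weights, one per feature
b = -0.5                    # bias: a single scalar shifting the score up or down

z = np.dot(w, x) + b        # z = w^T x + b
print(sigmoid(z))           # predicted probability of class 1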
Wow, amazing explanation! Really thank you so much!
Happy to help!
you are doing well sir
WOW !! you are just Awesome
Thank you 😇
Good explanation 👍
Thank you!
Very good explanations. I subscribed right away.
Thank you so much ! I appreciate your comment
Hey, what is b in the equation? Error? I was watching Andrew Ng's course and it gets confusing sometimes, but you explained so well what all those functions mean, letter by letter. I just missed the b value. Is it the same as error? Bias? ... Thanks mate, big shout out from Brazil.
Hi... thanks for the compliment. B is bias. Error is different from bias. Bias is just a parameter that we train.
Awesome! Super helpful
Loving your content
Thank you so much ! I am glad you found this helpful
Hi Jay! Thanks for your videos! I had one doubt. Why is logistic regression a linear classifier when the sigmoid function is a non-linear function? When we use the same logistic function in a neural network, it is said to form a non-linear classifier.
You can refer to this post: stats.stackexchange.com/questions/93569/why-is-logistic-regression-a-linear-classifier. It might help.
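The short version: the sigmoid is monotonic, so sigmoid(w^T x + b) >= 0.5 exactly when w^T x + b >= 0, and w^T x + b = 0 is a straight line (a hyperplane) in the feature space; the non-linearity bends the probabilities, not the boundary. A tiny sketch with made-up weights (assuming NumPy):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([1.0, -2.0])   # hypothetical trained weights
b = 0.5                     # hypothetical trained bias

def predict(x):
    # Class 1 iff the probability is at least 0.5, i.e. iff w.x + b >= 0
    return sigmoid(np.dot(w, x) + b) >= 0.5

# The prediction flips exactly where w.x + b changes sign, i.e. along the
# straight line 1.0*x1 - 2.0*x2 + 0.5 = 0.
print(predict(np.array([0.0, 1.0])))   # w.x + b = -1.5 -> False (class 0)
print(predict(np.array([2.0, 0.0])))   # w.x + b =  2.5 -> True  (class 1)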
Hi bro, thanks a lot for this amazing, valuable video. I have a question: how can we find the weights? Can you help with an example or a reference that explains it in a simple way?
Hi… have you watched my other videos on this topic? They might help you. 😄
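In the meantime, one common way to find the weights is gradient descent on the log loss. A minimal sketch (toy data, learning rate and iteration count picked arbitrarily, assuming NumPy; not necessarily the exact setup used in the videos):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy one-feature dataset (hypothetical)
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([0, 0, 0, 1, 1, 1])

w = np.zeros(X.shape[1])
b = 0.0
lr = 0.1                        # learning rate (chosen arbitrarily)

for _ in range(5000):
    p = sigmoid(X @ w + b)      # predicted probabilities
    error = p - y               # derivative of the log loss w.r.t. the score
    w -= lr * (X.T @ error) / len(y)
    b -= lr * error.mean()

print("learned w:", w, "learned b:", b)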
Loved it, thanks
I love you
very good explanation
Thank you 😄😇
superb bro
Thanks
Wow, thank you ❤
Thanks, brother!
Welcome!
Thank you sir
Welcome 😇
Why do we apply the straight line inside the sigmoid by substituting x with (wx + c)? Why don't we just stick with the sigmoid function to make the predictions?
Zach Star has a better video about this.
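One way to see it (a toy sketch with made-up numbers, assuming NumPy): plain sigmoid(x) always crosses 0.5 at x = 0 with a fixed steepness, so it cannot adapt to the data; putting wx + c inside lets w stretch or steepen the curve and c slide it sideways, so the same S-shape can be fitted wherever the classes actually separate:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.linspace(-10, 10, 5)

# Plain sigmoid(x): the 0.5 crossing is stuck at x = 0 and the slope is fixed
print(sigmoid(x))

# sigmoid(w*x + c): w controls the steepness, c shifts the curve sideways
w, c = 2.0, -6.0   # hypothetical values
print(sigmoid(w * x + c))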
How would you do a confusion matrix?
You can check out the sklearn library for that. I will make videos on it soon as well.
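For example, with made-up labels, scikit-learn's confusion_matrix gives it directly:

from sklearn.metrics import confusion_matrix

y_true = [0, 1, 1, 0, 1, 0, 1, 1]   # actual labels (hypothetical)
y_pred = [0, 1, 0, 0, 1, 1, 1, 1]   # model predictions (hypothetical)

# Rows are actual classes, columns are predicted classes:
# [[true negatives, false positives],
#  [false negatives, true positives]]
print(confusion_matrix(y_true, y_pred))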
Really Good! Thank you :)
You're welcome 😇
You're the best!
I gave you a thumbs up (number 38) because you are good and my age is 38.
Thank You so much 😊😇
Thank you
You're welcome
Thank you sir, you are doing well and this is a great job, and we will learn many things with you and...
Hey, thank you!
It's amazing.
When should we use logistic regression versus Naive Bayes learning?
Thanks
Coding Lane. Thanks for sharing your knowledge. I have sent you an email.
Hi Chan, I have replied to your mail. Sorry for the late reply, I was very busy the last few weeks.
Brother, which language is this?
I want to know your name.
My name is Eren Yeager.
Thanks for your great video.
probability /ˌpräbəˈbilədē/
Thank you