Comments •

  • @nandinibalyapally3388
    @nandinibalyapally3388 3 years ago +90

    I never understood what gradient descent and a cost function are until I watched this video 🙏🙏

  • @mohitpatel7876
    @mohitpatel7876 4 years ago +33

    Best explanation of the cost function; we learned it as master's students and the course couldn't explain it as well.. simply brilliant

  • @navjotsingh8372
    @navjotsingh8372 2 years ago +3

    I have seen many teachers explaining the same concept, but your explanations are next level. Best teacher.

  • @soumikdutta77
    @soumikdutta77 2 years ago +4

    Why am I not surprised by such a lucid and amazing explanation of the cost function, gradient descent, global minima, learning rate... maybe because watching you make complex things seem easy and normal has become one of my habits. Thank you SIR

  • @manikaransingh3234
    @manikaransingh3234 4 years ago +33

    I don't see a link in the top right corner for the implementation, as you mentioned at the end.

  • @koushikkumar4938
    @koushikkumar4938 3 years ago +6

    Implementation part:
    Multiple linear Regression - ruclips.net/video/5rvnlZWzox8/видео.html
    Simple linear Regression - ruclips.net/video/E-xp-SjfOSY/видео.html

  • @RJ-dz6ie
    @RJ-dz6ie 4 years ago +4

    How can I not say that you are amazing!! I was struggling to understand the importance of gradient descent and you explained it to me in the simplest way possible.. Thank you so much sir :)

  • @shubhamkohli2535
    @shubhamkohli2535 3 years ago

    Really awesome video, so much better than many famous online portals charging huge amounts of money to teach these things.

  • @tarunsingh-yj9lz
    @tarunsingh-yj9lz 11 months ago

    Best video on YouTube to understand the intuition and math (surface level) behind linear regression.
    Thank you for such great content

  • @anuragmukherjee1878
    @anuragmukherjee1878 2 years ago +28

    For those who are confused:
    the derivative in the convergence step will be dJ/dm.

    • @tusharikajoshi8410
      @tusharikajoshi8410 1 year ago

      What's J in this? The y values? I'm super confused about this d/dm of m, because it would just be 1. And m, I think, is just the total number of values. Shouldn't the slope be d/dx of y?

    • @mdmynuddin1888
      @mdmynuddin1888 1 year ago

      @@tusharikajoshi8410 It will be the cost or loss (J).

    • @mdmynuddin1888
      @mdmynuddin1888 1 year ago +2

      m_new = m - d(loss or cost)/dm * alpha (learning rate).

    • @suhasiyer7317
      @suhasiyer7317 10 months ago

      Super helpful

    • @threads25
      @threads25 8 months ago

      I don't think so, because that's actually Newton's method.
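
To make the update rule discussed in this thread concrete, here is a minimal Python sketch of gradient descent on the slope alone (made-up toy data, assuming the usual mean-squared-error cost and a fixed intercept; illustrative only, not code from the video):

```python
import numpy as np

# Toy data lying roughly on y = 2x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

def cost(m, c=0.0):
    # J(m) = mean((y_i - (m*x_i + c))^2)
    return np.mean((y - (m * x + c)) ** 2)

def dJ_dm(m, c=0.0):
    # Analytic derivative of J with respect to the slope m (not dm/dm, which is 1)
    return np.mean(-2 * x * (y - (m * x + c)))

m, alpha = 0.0, 0.01           # initial slope and learning rate
for _ in range(100):
    m = m - alpha * dJ_dm(m)   # m_new = m - alpha * dJ/dm
print(m, cost(m))              # m ends up close to 2
```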

  • @animeshkoley6478
    @animeshkoley6478 3 years ago +2

    Best explanation of linear regression 🙏🙏🙏. Simply wow 🔥🔥

  • @anuragbhatt6178
    @anuragbhatt6178 3 years ago +1

    The best I've come across on gradient descent and the convergence theorem

  • @shailesh1981able
    @shailesh1981able 2 years ago

    Awesome!! Cleared all my doubts after seeing this video! Thanks a lot Mr. Krish for creating in-depth content on such subjects!

  • @supervickeyy1521
    @supervickeyy1521 4 years ago

    I knew the concept of linear regression but didn't know the logic behind it... the way the regression line is chosen. Thanks for this!

  • @pranitaumarji5224
    @pranitaumarji5224 4 years ago +4

    Thank you for this awesome explanation!

  • @FaizanKhan-fn6ew
    @FaizanKhan-fn6ew 4 years ago +1

    Thank you so much for all your efforts... Knowledge, rate of speech, and the ability to make things easy are the nicest skills that you hold...

  • @skviknesh
    @skviknesh 3 years ago +1

    Great! Fantastic! Fantabulous! Tasting the satisfaction of learning completely - only in your videos!!!!!

  • @PRASHANTSHARMA-ev7rr
    @PRASHANTSHARMA-ev7rr 4 years ago +4

    Hi Sir, I am from a cloud & DevOps background. Does it make sense to go and learn ML/AI? What path can I follow to become a DataOps engineer or a DevOps ML/AI engineer?

  • @harshdhamecha5503
    @harshdhamecha5503 3 years ago +34

    There's a little correction in the convergence theorem:
    the derivative of J(m) should be in the numerator in place of the derivative of m.

    • @salmanjaved2816
      @salmanjaved2816 3 years ago

      Correct 👍
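
For reference, the corrected update step being described is, in the usual notation (an assumption; whether J carries a 1/n or 1/(2n) factor only rescales the learning rate):

```latex
m_{\text{new}} \;=\; m \;-\; \alpha \,\frac{\partial J(m)}{\partial m},
\qquad
J(m) \;=\; \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (m x_i + c)\bigr)^2
```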

  • @nanditagautam6310
    @nanditagautam6310 2 years ago +1

    This is the best stuff I ever came across on this topic!

  • @azizahmad1344
    @azizahmad1344 2 years ago +2

    Such a great explanation of gradient descent and the convergence theorem.

  • @rambaldotra2221
    @rambaldotra2221 3 years ago

    Thank you Sir, you have explained everything about gradient descent in the best and easiest way possible!!

  • @dhainik.suthar
    @dhainik.suthar 3 years ago +4

    This math is the same as in the Coursera machine learning courses.
    Thank you sir for this great content...

  • @priyanshusharma2516
    @priyanshusharma2516 2 years ago +4

    Watched this video 3 times back to back. Now it's embedded in my mind forever. Thanks Krish, great explanation!!

  • @V2traveller
    @V2traveller 4 years ago

    Every line you speak is so important for understanding this concept... thank you

  • @PritishMishra
    @PritishMishra 3 years ago

    I knew that there would be an Indian who could make all this stuff easy!! Thanks Krish

  • @mayureshgawai5951
    @mayureshgawai5951 3 years ago

    You can hardly find an easy explanation of gradient descent on YouTube. This video is the exception.

  • @nidhimehta9278
    @nidhimehta9278 3 years ago

    Best video on the theory of linear regression! Thank you so much Krish!

  • @pradeepmallampalli6510
    @pradeepmallampalli6510 3 years ago

    Thank you so much Krish. Nowhere else could I find such a detailed explanation.
    You made my day!

  • @annapurnaparida7655
    @annapurnaparida7655 3 years ago

    So beautifully explained... did not find this kind of clarity anywhere else... keep up the good work...

  • @moulisiramdasu6753
    @moulisiramdasu6753 3 years ago

    Really, thank you Krish.
    You just cleared my doubts on the cost function and gradient descent. First I watched Andrew Ng's class but still had a few doubts; after seeing your video, now it's crystal clear.
    Thank you...

  • @ngarwailau2665
    @ngarwailau2665 2 years ago

    Your explanations are the clearest!!!

  • @kunaltibrewal2881
    @kunaltibrewal2881 4 years ago +2

    It would be great if you could suggest some of the best books for Python programming.

  • @arunsundar489
    @arunsundar489 4 years ago +6

    Please add the in-depth math intuition of other algorithms like logistic regression, random forest, support vector machines, and ANNs. Many thanks for the clear explanation of linear regression.

  • @varshadevgankar8242
    @varshadevgankar8242 3 years ago +3

    Sir, I can't find the simple regression and multiple regression videos as you said, and some videos are a little jumbled, so it's getting difficult
    to follow them. Please also explain the functionality of each and every keyword or built-in function when you're explaining the code... Of course you explain in a very good way, but I faced a little problem while following the practical implementation of univariate, multivariate, and bivariate analysis (there you used the FacetGrid function)... so will you please explain the exact use of FacetGrid?

  • @magelauditore333
    @magelauditore333 4 years ago

    Sir, you are outstanding. Please keep it up

  • @ayurdubey4818
    @ayurdubey4818 2 years ago +8

    The video was really great. But I would like to point out that in the derivative you took for the convergence theorem, instead of (dm/dm) it should be the derivative of the cost function with respect to m. Also, a little suggestion: at the end it would have been helpful if you had mentioned what m was, the total number of points or the slope of the best-fit line. Apart from this the video helped me a lot; hope you add a text note somewhere in this video to help others.
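
To resolve the notation ambiguity raised above: in the usual formulation (an assumption here; the video's own symbols may differ), m denotes the slope, c the intercept, and the number of data points gets its own symbol n:

```latex
J(m, c) \;=\; \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - \hat{y}_i\bigr)^2,
\qquad \hat{y}_i = m x_i + c,
\qquad \frac{\partial J}{\partial m} \;=\; -\frac{2}{n}\sum_{i=1}^{n} x_i\bigl(y_i - \hat{y}_i\bigr)
```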

  • @9902152322
    @9902152322 2 years ago

    God bless you too sir, explained very well. The basics help to grow a high-level understanding.

  • @ankitchauhan6629
    @ankitchauhan6629 3 years ago +1

    What about the C (intercept) value? How does the algorithm select the C value?

  • @nikhil1303
    @nikhil1303 4 years ago +4

    This is a really good explanation of linear regression, Krish.. looking forward to checking out more of your videos.. thanks and keep going!!

  • @w3r161
    @w3r161 3 months ago

    Thank you my friend, you are a great teacher!

  • @shhivram929
    @shhivram929 3 years ago +3

    Hi Krish, that was an awesome explanation of gradient descent with respect to finding the optimal slope.
    But in linear regression both the slope and the intercept are tweakable parameters, so how do we achieve the optimal intercept value?
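
To address the question above: gradient descent treats the intercept exactly like the slope, updating both with their own partial derivatives. A minimal sketch with made-up data (assuming the MSE cost; variable names are illustrative, not from the video):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, 4.9, 7.2, 8.8])   # roughly y = 2x + 1

m, c, alpha = 0.0, 0.0, 0.01
for _ in range(5000):
    err = y - (m * x + c)            # residuals under the current m, c
    dJ_dm = np.mean(-2 * x * err)    # partial derivative of J w.r.t. the slope
    dJ_dc = np.mean(-2 * err)        # partial derivative of J w.r.t. the intercept
    m, c = m - alpha * dJ_dm, c - alpha * dJ_dc   # simultaneous update
print(m, c)                          # approaches slope ≈ 1.9, intercept ≈ 1.1
```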

  • @pearlbabbar7981
    @pearlbabbar7981 2 years ago

    Amazing explanation! I have one question: where did you study all of this? Some books or the net?

  • @shan5612
    @shan5612 3 years ago +2

    Great, but I'm not able to find the link for how to implement this in Python. Awaiting your valuable reply.

  • @deeptigoyal4342
    @deeptigoyal4342 3 years ago

    One of the best explanations so far :)

  • @dan684al
    @dan684al 3 years ago

    Where is the link in the top right corner for the implementation, as you said at the end?

  • @ahmedbouchou6893
    @ahmedbouchou6893 4 years ago +3

    Hi. Can you please do a video about the architecture of machine learning systems in the real world? How does it really work in real life? For example, how Hadoop (Pig, Hive), Spark, Flask, Cassandra, and Tableau are all integrated to create a machine learning architecture. Like an e2e.

  • @aayushsuman4592
    @aayushsuman4592 3 months ago

    Thank you so much, Krish!

  • @mellowftw
    @mellowftw 3 years ago

    Thanks so much sir.. you're doing good work for the community

  • @tamellasivasubrahmanyam6683
    @tamellasivasubrahmanyam6683 4 years ago

    You are the ultimate; I got answers to so many questions. The video is good.

  • @pjanjanam
    @pjanjanam 2 years ago +1

    A small comment at 17:35. I guess it is the derivative of J(m) with respect to m, in other words the rate of change of J(m) over a minute change in m. That gives us the slope at instantaneous points, especially for non-linear curves where the slope is not constant. At each point (m, J(m)), gradient descent travels in the opposite direction of the slope to find the global minimum, with a small learning rate. Please correct me if I am missing something.
    Thanks for a wonderful video on this concept @Krish; your videos are very helpful for understanding the math intuition behind the concepts. I am a super beneficiary of your videos. Huge respect!!
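
The "travel opposite to the slope" point can be checked numerically. A tiny illustration (a hypothetical one-parameter cost, not from the video):

```python
# J(m) = (m - 3)^2 has its minimum at m = 3.
def J(m):
    return (m - 3) ** 2

def dJ_dm(m, h=1e-6):
    return (J(m + h) - J(m - h)) / (2 * h)   # numerical derivative

alpha = 0.1
for m in (5.0, 1.0):
    step = m - alpha * dJ_dm(m)
    # At m = 5 the slope is positive, so the update moves m DOWN toward 3;
    # at m = 1 the slope is negative, so the update moves m UP toward 3.
    print(m, round(dJ_dm(m), 3), step)
```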

  • @vishwashah4109
    @vishwashah4109 3 years ago

    Best explanation. Thank you!

  • @hardikshah_99
    @hardikshah_99 3 years ago

    Implementation Links:
    Simple Linear Regression: ruclips.net/video/E-xp-SjfOSY/видео.html
    Multiple Linear Regression: ruclips.net/video/5rvnlZWzox8/видео.html

  • @nikifoxy69
    @nikifoxy69 3 years ago

    Loved it. Thanks Krish.

  • @meetbardoliya6645
    @meetbardoliya6645 2 years ago

    The value of this video is just undefinable! Thanks a lot :)

  • @ShiVa-jy5ly
    @ShiVa-jy5ly 3 years ago

    Thank you sir... I get to learn so much from you.

  • @vishnuppriya5263
    @vishnuppriya5263 1 year ago

    Really great, sir. Thank you very much for this clear explanation.

  • @shrikantlandage7305
    @shrikantlandage7305 4 years ago

    My god, that was crystal clear... thanks Krish

  • @SanjeevKumar-dr6qj
    @SanjeevKumar-dr6qj 1 year ago

    Great sir. Love this video

  • @shaktirajsinhzala4588
    @shaktirajsinhzala4588 3 years ago

    Sir, is there a playlist for this series? Where can I find it? About CDF, PDF...

  • @python_by_abhishek
    @python_by_abhishek 3 years ago +7

    Before watching this video I was struggling with the concepts, exactly like you were struggling to plot the gradient descent curve. ☺️ Thanks for explaining this beautifully.

  • @wellwhatdoyakno6251
    @wellwhatdoyakno6251 2 years ago

    Lovely! Love it.

  • @auroshisray9140
    @auroshisray9140 3 years ago

    Thank you Krish bhaiya!

  • @arrooow9019
    @arrooow9019 3 years ago

    Oh my gosh, this is the most awesome tutorial I have ever seen. God bless you sir 🤩🤩

  • @dheerajnuka6245
    @dheerajnuka6245 4 years ago

    Hi, I tried searching for the links but was not able to find them. Could you share the link for better practice? Thanks.

  • @Neuraldata
    @Neuraldata 3 years ago +1

    We would also recommend your videos to our students!

  • @RanjithKumar-jo7xf
    @RanjithKumar-jo7xf 2 years ago

    Nice explanation, I like this.

  • @juozapasjurksa1400
    @juozapasjurksa1400 2 years ago

    Thank you! This video was so good!

  • @nayanjain3594
    @nayanjain3594 3 years ago

    Hi Krish, how do we calculate the intercept value? In this we initialized it to 0 and never computed it at the end; we calculated only the slope of the best-fit line.

  • @mohitpatel7876
    @mohitpatel7876 4 years ago +6

    At 14:56, how do we decide how many slope values to try? And how about selecting intercepts in a certain range?

    • @ruchit9697
      @ruchit9697 4 years ago

      The trials of slope values go on until the cost function reaches the local minimum point... and for the intercept there are random initialization techniques through which an initial value is set...
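
Concretely, the number of slope values tried is usually not fixed in advance: iteration stops when the gradient (or the change in cost) falls below a tolerance, or after a maximum number of steps. A sketch under those assumptions (toy data, intercept fixed at 0 for brevity):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])        # exactly y = 2x

m, alpha, tol, max_iters = 0.0, 0.01, 1e-6, 10_000
for i in range(max_iters):
    grad = np.mean(-2 * x * (y - m * x))  # dJ/dm for the MSE cost
    if abs(grad) < tol:                   # converged: stop trying new slopes
        break
    m = m - alpha * grad
print(f"tried {i + 1} slope values, m = {m:.4f}")
```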

  • @ishanthakur3315
    @ishanthakur3315 3 years ago

    Are you taking the derivative of the cost function w.r.t. m in the convergence theorem? Please reply!

  • @shchiranth6626
    @shchiranth6626 2 years ago

    Great tutorial sir, I got things pretty quickly with this video, thank you.

  • @priyankachoubey4570
    @priyankachoubey4570 2 years ago

    As always, Krish, very well explained!!

  • @namithapr4966
    @namithapr4966 4 years ago

    Can you please provide the video link mentioned in this tutorial? I cannot find it.

  • @rezafarrokhi9871
    @rezafarrokhi9871 3 years ago +5

    Thanks for all the well-prepared videos. I think you meant (d J(m) / dm) at 17:45, is that correct?

  • @kevinsusan3345
    @kevinsusan3345 4 years ago +2

    I had so much difficulty understanding gradient descent, but after this video
    it's perfectly clear.

  • @dileepkumar-nd1fo
    @dileepkumar-nd1fo 3 years ago

    Hi Krish,
    where can I find the previous video on linear regression?

  • @nivitus9037
    @nivitus9037 4 years ago +2

    Great...

  • @cutecreature_san
    @cutecreature_san 3 years ago

    Your videos are clear and easy to understand.

  • @cynthialobo1995
    @cynthialobo1995 4 years ago

    Very nice explanation.
    Thank you.

  • @akrsrivastava
    @akrsrivastava 4 years ago +28

    Hi Krish, thanks for the video. Some queries/clarifications required:
    1. We do not take the gradient of m w.r.t. m; that will always be 1. We take the gradient of J w.r.t. m.
    2. If we have already calculated the cost function J at multiple values of m, then why do we need to do gradient descent, since we already know the m where J is minimum?
    3. So we start with an m, calculate grad(J) at that point, update m with m' = m - grad(J) * learn_rate, and repeat till we reach some convergence criterion.
    Please let me know if my understanding is correct.

    • @slowhanduchiha
      @slowhanduchiha 3 years ago

      Yes this is correct

    • @vamsikrishna4107
      @vamsikrishna4107 3 years ago

      I think we have to train the model to reach that minimum-loss point while performing gradient descent in real-life problems.

    • @shreyasbs2861
      @shreyasbs2861 3 years ago

      How to find the best Y intercept?
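
The points in this thread can be confirmed with a compact sketch: in practice we do not precompute J over a grid of m values (with two or more parameters the grid blows up), we iterate the update from point 3 until a convergence criterion is met, and the intercept is found by the same loop. Illustrative toy data, assuming the MSE cost:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])      # exactly y = 2x + 1

def J(m, c):                             # MSE cost
    return np.mean((y - (m * x + c)) ** 2)

m = c = 0.0
alpha, prev = 0.05, np.inf
while prev - J(m, c) > 1e-10:            # stop once the cost barely improves
    prev = J(m, c)
    err = y - (m * x + c)
    m -= alpha * np.mean(-2 * x * err)   # grad of J w.r.t. the slope m
    c -= alpha * np.mean(-2 * err)       # grad of J w.r.t. the intercept c
print(m, c)                              # converges near m = 2, c = 1
```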

  • @mvcutube
    @mvcutube 3 years ago

    Nice tutorial. Thank you

  • @sahilswaroop1996
    @sahilswaroop1996 4 years ago

    Excellent video, you are a champion, man.

  • @debrupdey7948
    @debrupdey7948 1 year ago

    Great video sir, so lucid.

  • @ajinkyadeshmukh9674
    @ajinkyadeshmukh9674 2 years ago

    Hi Krish, that was an awesome explanation of the maths used for linear regression, especially the cost function. Can you make a video on the 5 assumptions of linear regression and explain the assumptions in detail?

  • @PavanKumar-xg8ye
    @PavanKumar-xg8ye 3 years ago

    Excellent!!!!!

  • @avinashgote2770
    @avinashgote2770 1 year ago

    Good explanation, it cleared all my queries.

  • @sharikatv1989
    @sharikatv1989 3 years ago

    This is super helpful!

  • @rakeshenjapuri3143
    @rakeshenjapuri3143 3 years ago

    Why are we using the cost function and gradient, sir? What is the conclusion?
    Can we apply this to multiple linear and logistic regression as well?

  • @pradnyavk9673
    @pradnyavk9673 1 year ago

    Very well explained. Thank you.

  • @glamgalmanu
    @glamgalmanu 4 years ago +2

    Can you do more math intuitions please? These are very helpful. Thanks!

  • @Dinesh-uh4gw
    @Dinesh-uh4gw 3 years ago

    Excellent Explanation

  • @Uma7473
    @Uma7473 4 years ago +1

    Can you upload a video on multi-label classification with an example using scikit-learn?

  • @jaspreetsingh5334
    @jaspreetsingh5334 2 years ago

    Thanks Krish, you are helping a lot.

  • @nurali2525
    @nurali2525 2 years ago

    This guy was born to teach

  • @satyamrout1400
    @satyamrout1400 4 years ago

    Sir, I think the link for the practical implementation is not provided. Can you please give that link?

  • @kushshri05
    @kushshri05 4 years ago +3

    Please try to upload videos in this series within a span of 2 days...

  • @sampathkumarmanchala2237
    @sampathkumarmanchala2237 4 years ago

    Doubt: does the model not select the best slope by itself?

  • @dhruv1324
    @dhruv1324 11 months ago

    Never found a better explanation.

  • @user-ec9he3nz7f
    @user-ec9he3nz7f 3 months ago +1

    Really great explanation sir 😍