Live Day 6 - Discussing KMeans, Hierarchical and DBSCAN Clustering Algorithms

  • Published: 4 Feb 2025

Comments • 88

  • @pollypravir5378
    @pollypravir5378 2 years ago +3

    Thanks

  • @AmirAli-id9rq
    @AmirAli-id9rq 2 years ago +20

    Top-notch session! In easy terms: bias is the inability of an ML algorithm to capture the exact relationship. To understand bias, think about why we need ML in the first place. In mathematics or physics we have an exact relationship or formula between the dependent and independent variables, like s = ut + 1/2 at^2 (class 7 physics) or SI = P*R*T, so for cases where an exact formula exists we don't need any ML algorithm. ML tries to do the same thing, i.e. estimate a formula. Say I want to calculate purchasing power (P), so I train a model on variables like income, age and family income, and the model fits a formula P = w0 + b1*income + b2*age + b3*family_income. This formula is not absolute or universal, since it is derived by a specific ML algorithm from specific data. But say that by some miracle we derive a formula that calculates purchasing power with 100 percent accuracy; for that model the bias is 0, because it captures the relationship exactly.

    Variance: in short, the difference in fits between datasets is called variance. Imagine we use that same miracle formula on test data and again get 100 percent accuracy (on a different test set); then the variance is 0, which means the formula is perfect. Or say the same miracle formula gives 50% accuracy on the test set; then the bias was low but the variance is high, because the formula didn't work well on unseen (test) data. So in an imaginary world where bias is 0 and variance is also 0, my friend, you have discovered a formula, not an estimation. In the practical world we aim for a model with low bias and low variance. Subscribe to Krish's channel if this helped.
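
A minimal Python sketch of the idea above, using made-up purchasing-power data (the feature names, coefficients and noise level are illustrative assumptions, not from the video): a poor training score points to high bias, and a large gap between training and test scores points to high variance.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 500
    income = rng.normal(50_000, 15_000, n)
    age = rng.normal(40, 10, n)
    family_income = rng.normal(80_000, 20_000, n)
    X = np.column_stack([income, age, family_income])

    # Hypothetical "true" purchasing-power relationship plus noise
    y = 0.4 * income + 120 * age + 0.1 * family_income + rng.normal(0, 2_000, n)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
    # Estimates P = w0 + b1*income + b2*age + b3*family_income
    model = LinearRegression().fit(X_train, y_train)

    # Low training score   -> high bias (model cannot capture the relationship)
    # Large train/test gap -> high variance (the fit does not transfer to unseen data)
    print("train R^2:", model.score(X_train, y_train))
    print("test  R^2:", model.score(X_test, y_test))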

  • @gayanath009
    @gayanath009 10 months ago

    Super explanation as always. Hats off!

  • @rahulalladi2086
    @rahulalladi2086 3 years ago +12

    I got placed at Tiger Analytics.
    Credit goes to you, Krish.
    Your videos helped me crack the interview.

    • @rafibasha4145
      @rafibasha4145 3 years ago +1

      Hi Rahul, congrats. Please share the interview questions.

  • @akhilbez88
    @akhilbez88 1 year ago

    You are the best teacher I have had in this domain; thanks a lot for sharing this kind of knowledge.

  • @geekyprogrammer4831
    @geekyprogrammer4831 3 years ago +16

    Good evening Krish. Your content is absolutely a gold mine. Please arrange Deep Learning sessions next :)

  • @yusmanisleidissotolongo4433
    @yusmanisleidissotolongo4433 10 months ago

    Excellent, just excellent. Thanks

  • @vikascbr
    @vikascbr 2 years ago +1

    Good morning Krish. You have really made my foundation very strong; before this I knew nothing about statistics and machine learning, coming from a non-technical background. Now I can read very high-level books and really understand them. You are a great value addition to my learning path.

  • @n.o.t.important
    @n.o.t.important 3 years ago +26

    A humble request to you @Krish, make next live streams on Deep Learning.

    • @Ishaheennabi
      @Ishaheennabi 3 years ago +3

      ya

    • @prashantkandarkar8993
      @prashantkandarkar8993 3 years ago +2

      Yes

    • @kkevinluke
      @kkevinluke 3 years ago +2

      I would go with EDA, because that is more applicable in job scenarios. It depends on the role, but generally most roles require strong EDA knowledge, so I would go for 7 days of EDA next.

    • @n.o.t.important
      @n.o.t.important 3 years ago +1

      @kkevinluke Looks like your opinion won, and I also agree with you.

  • @krishnadhawalapure
    @krishnadhawalapure 1 year ago

    You are one of the best teachers any student can have. ❤

  • @kumarnityanand4731
    @kumarnityanand4731 2 years ago +1

    An excellent, knowledge-packed session; every second spent was a gain. Thanks a lot 😊 Keep helping and sharing the knowledge & concepts 💐💐💐

  • @abhishekpatil1106
    @abhishekpatil1106 2 years ago

    First things first: great session 👏 👌 👍

  • @augustinonyambile431
    @augustinonyambile431 15 days ago

    Thanks a lot

  • @kaustubhkapare807
    @kaustubhkapare807 3 years ago +2

    Thank You

  • @Dovahkiin7994
    @Dovahkiin7994 2 years ago

    Thanks for this great Tutorial.

  • @KamalSingh-rt2bb
    @KamalSingh-rt2bb 2 years ago

    Hello sir, I start every morning with a new machine learning session, and the last 6 days have taught me a lot about machine learning algorithms. Thank you very much for this playlist.

  • @harshitsamdhani1708
    @harshitsamdhani1708 1 year ago

    Thank you for the lecture

  • @pankajgoikar4158
    @pankajgoikar4158 2 years ago

    You are just amazing Sir. 😊

  • @gummalasaiteja961
    @gummalasaiteja961 2 years ago +1

    1.75x speed is the best way to watch; a lot of information is covered in less time.

  • @gh504
    @gh504 3 years ago +1

    Amazing explanation, thank you sir.

  • @shubhamgupta09
    @shubhamgupta09 2 years ago +7

    Hi Sir, at 1:11:00 I think you mistakenly swapped the terms high bias and low bias. It should be: high bias -> does not perform well, low bias -> performs well. We want low bias & low variance for a generalized model, as it performs well. Correct me if I am wrong.

    • @ashutoshmishra6920
      @ashutoshmishra6920 2 years ago

      We know he said it by mistake; there was no need to comment on it. Don't lecture.

  • @pankajkumarbarman765
    @pankajkumarbarman765 3 years ago

    Thank you so much sir❤️

  • @akarkabkarim
    @akarkabkarim 2 years ago

    Thank you, sir Krish.

  • @piyushsonekar1225
    @piyushsonekar1225 1 year ago

    Thanks! I really wanted to know the exact definition of bias & variance.
    Great teaching.

  • @ridoychandraray2413
    @ridoychandraray2413 2 years ago

    Krish Naik Sir is Awesome

  • @tanwilliam7351
    @tanwilliam7351 3 years ago +2

    Yes DEEP LEARNING NEXT!

  • @navalsehgal1015
    @navalsehgal1015 1 year ago

    Keep it up.

  • @sandipansarkar9211
    @sandipansarkar9211 2 years ago +1

    finished watching

  • @spiropython
    @spiropython 1 year ago

    Hello sir take care of your health

  • @LearningWithNisa
    @LearningWithNisa 1 year ago

    Hello sir, you are doing a great job. Do you have any video related to OPTICS clustering?

  • @rafibasha4145
    @rafibasha4145 3 years ago +4

    Please cover XGBoost, GBM and CatBoost in live videos so we can understand and learn better.

  • @AmirAli-id9rq
    @AmirAli-id9rq 2 years ago +2

    At 1:11:31, I guess it's wrong: if the model captures the relationship (between the dependent and independent variables) in the data well, then it has low bias, not high bias. Low bias means the model's formula is flexible enough to capture the relationship; high bias means the accuracy is low and the model is unable to capture the actual data points. Please verify, guys.
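
A small sketch of what "unable to capture the relationship" looks like, assuming scikit-learn and synthetic quadratic data (the data is my own illustration): a straight-line model underfits the curve (high bias), while a degree-2 polynomial captures it (low bias).

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, (200, 1))
    y = X[:, 0] ** 2 + rng.normal(0, 0.3, 200)      # the true relationship is a curve

    # A straight line cannot capture the curve -> poor fit even on training data -> high bias
    line = LinearRegression().fit(X, y)
    # A degree-2 polynomial is flexible enough  -> good fit                      -> low bias
    curve = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

    print("straight line  train R^2:", round(line.score(X, y), 3))   # close to 0
    print("degree-2 model train R^2:", round(curve.score(X, y), 3))  # close to 1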

  • @ramdasprajapati7884
    @ramdasprajapati7884 1 year ago

    Beautiful sir....

  • @raghavsharma8512
    @raghavsharma8512 2 years ago

    superb.....!!

  • @md.ishtiakrashid1523
    @md.ishtiakrashid1523 1 year ago

    The video was very good. But how do we calculate feature importance after k-means clustering?
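
One common workaround (an assumption on my part, not something covered in the video) is to treat the cluster labels as a target, fit a supervised model on them, and read that model's feature importances. A minimal scikit-learn sketch:

    from sklearn.cluster import KMeans
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier

    X, _ = load_iris(return_X_y=True)

    # Cluster first, then ask which features best separate the resulting clusters
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    clf = RandomForestClassifier(random_state=0).fit(X, labels)
    print(clf.feature_importances_)   # proxy for how much each feature drives the clustering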

  • @mainakseal5027
    @mainakseal5027 1 year ago

    East or west, Naik sir is super duper best.

  • @rafibasha4145
    @rafibasha4145 3 years ago +2

    Please start mock interview sessions as well

  • @rohanwaghulkar3551
    @rohanwaghulkar3551 1 year ago

    Sir, please make a video on homogeneity, completeness, V-measure and the Davies-Bouldin index.
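
Until such a video exists, scikit-learn already implements all four metrics. A minimal sketch on synthetic blobs (the data and the choice of k are assumptions for illustration):

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import (completeness_score, davies_bouldin_score,
                                 homogeneity_score, v_measure_score)

    X, y_true = make_blobs(n_samples=300, centers=3, random_state=0)
    y_pred = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

    # Homogeneity, completeness and V-measure compare predicted clusters against known labels
    print("homogeneity   :", homogeneity_score(y_true, y_pred))
    print("completeness  :", completeness_score(y_true, y_pred))
    print("V-measure     :", v_measure_score(y_true, y_pred))
    # Davies-Bouldin needs only the data and the predicted labels (lower is better)
    print("Davies-Bouldin:", davies_bouldin_score(X, y_pred))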

  • @ishwarsalunke1838
    @ishwarsalunke1838 1 year ago

    Depends on the data points

  • @kkevinluke
    @kkevinluke 3 years ago

    Hello @Krish, thank you for the explanations. Please do in-depth EDA sessions next. I appreciate your efforts very much; thanks again.

  • @darshanvala9224
    @darshanvala9224 2 years ago

    10 out of 10

  • @rakeshliparefms2
    @rakeshliparefms2 2 years ago

    Hi Krish sir, it's great learning from you.
    Can you please make a detailed video on Principal Component Analysis?

  • @sandeepagarwal8566
    @sandeepagarwal8566 3 years ago +2

    Yes Deep learning course

  • @cloudengineer1348
    @cloudengineer1348 3 years ago +2

    Hi Krish, are you planning to take an ML (Deep Learning) session?

  • @sejalkale67
    @sejalkale67 3 years ago

    A humble request to you @Krish: make the next live streams on machine learning practice and practicals.

  • @hamzasabir6480
    @hamzasabir6480 1 year ago

    Hello Krish! How is it possible to have 3 centroids when k=2 is specified, as you mentioned at 32:00 while introducing k-means++?

  • @mdyounusahamed6668
    @mdyounusahamed6668 2 years ago

    Please make some videos on soft clustering algorithms (e.g. Fuzzy C-Means).

  • @sridharbajpai420
    @sridharbajpai420 1 year ago

    51:27 K-means can't form clusters like this; k-means produces convex clusters in the data.
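
A quick way to see this point, assuming scikit-learn and its make_moons toy data (my choice, not from the video): k-means carves the plane into convex regions, so it splits the two crescents badly, while DBSCAN follows density and recovers them.

    from sklearn.cluster import DBSCAN, KMeans
    from sklearn.datasets import make_moons
    from sklearn.metrics import adjusted_rand_score

    X, y_true = make_moons(n_samples=400, noise=0.05, random_state=0)

    km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    db_labels = DBSCAN(eps=0.2, min_samples=5).fit_predict(X)

    # Agreement with the true crescent labels (1.0 = perfect recovery)
    print("k-means ARI:", adjusted_rand_score(y_true, km_labels))
    print("DBSCAN  ARI:", adjusted_rand_score(y_true, db_labels))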

  • @harshgupta3641
    @harshgupta3641 2 years ago

    This video is incredible and very well explained. But if we have more than one feature in our dataset, should we do feature selection first and then perform the elbow test?
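
A minimal sketch of the elbow test with several features, assuming scikit-learn and synthetic data (feature selection is optional; scaling the features so no single one dominates the distance is the more common first step):

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.preprocessing import StandardScaler

    X, _ = make_blobs(n_samples=500, centers=4, n_features=6, random_state=0)
    X = StandardScaler().fit_transform(X)   # put all features on a comparable scale

    # Elbow method: within-cluster sum of squares (inertia) for each candidate k
    for k in range(1, 11):
        inertia = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
        print(k, round(inertia, 1))
    # Plot k against inertia and pick the k where the curve bends (the "elbow").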

  • @rafibasha4145
    @rafibasha4145 3 years ago

    Please let me know which algorithm works better on which kind of data (linear, non-linear, etc.).

  • @devkumaracharyaiitbombay5341
    @devkumaracharyaiitbombay5341 2 months ago

    5:37 because it is 2022

  • @amritakaul87
    @amritakaul87 2 years ago

    @KrishNaik sir, kindly provide the DBSCAN video link.

  • @BhavyaArora-co2wd
    @BhavyaArora-co2wd 8 months ago

    Could someone share the GitHub link that is referenced at 51:51?

  • @minhaoling3056
    @minhaoling3056 3 years ago +2

    Will you do a deep learning series?

  • @ankan54
    @ankan54 2 years ago

    What types of biases can there be in a dataset? How should one answer this question?

  • @ishwarsalunke1838
    @ishwarsalunke1838 1 year ago

    Silhouette score

  • @dukesoni5477
    @dukesoni5477 2 years ago

    Found the perfect channel to learn ML from, brother. Thoroughly enjoyed it, sir.

  • @dataanalyst1012
    @dataanalyst1012 2 years ago

    In k-means clustering, is there an assumption about the number of observations and variables? Would having more variables than observations affect the results of clustering and make it less accurate?

  • @paneercheeseparatha
    @paneercheeseparatha 1 year ago

    K-means clustering is not mathematically clear to me. The line you're drawing connecting the two centroids is fine, but how is that perpendicular line drawn? I mean, how is that perpendicular line decided? Also, for a new point, will that line be used to classify it, or is k-nearest neighbours to be used?
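
A small sketch of the rule behind that line (my own illustration with hypothetical centroids): each point is assigned to its nearest centroid, so the boundary between two clusters is exactly the set of points equidistant from the two centroids, which is the perpendicular bisector of the segment joining them. A new point is assigned with the same nearest-centroid rule, not with k-nearest neighbours.

    import numpy as np

    c1 = np.array([0.0, 0.0])    # hypothetical centroid 1
    c2 = np.array([4.0, 0.0])    # hypothetical centroid 2; their bisector is the line x = 2

    def assign(point):
        # Nearest-centroid rule used by k-means for every point (and for new points too)
        return 1 if np.linalg.norm(point - c1) <= np.linalg.norm(point - c2) else 2

    print(assign(np.array([1.0,  3.0])))   # left of x = 2  -> cluster 1
    print(assign(np.array([3.0, -2.0])))   # right of x = 2 -> cluster 2
    print(assign(np.array([2.0,  5.0])))   # exactly on the bisector -> tie, goes to cluster 1 here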

  • @zahrasiraj766
    @zahrasiraj766 3 years ago

    Sir, can you make an urgent lecture on the cluster labeling problem? The document cluster labeling thing? And what if we extend it to hierarchical cluster labeling?

  • @a.chitrranshi
    @a.chitrranshi 3 years ago +1

    Quick question: does high bias mean better accuracy?

  • @AkashdeepDixit-x2h
    @AkashdeepDixit-x2h 1 year ago

    How do we find eps and minPts (min_samples) in DBSCAN?
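
A common heuristic (not from the video) is: pick min_samples first (a small value such as 5, or roughly twice the number of dimensions), then use a sorted k-distance plot and take eps at the "knee" of the curve. A sketch with scikit-learn on toy data:

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.neighbors import NearestNeighbors

    X, _ = make_moons(n_samples=400, noise=0.05, random_state=0)

    min_samples = 5                               # rule of thumb; ~2 * n_features is another choice
    nn = NearestNeighbors(n_neighbors=min_samples).fit(X)
    distances, _ = nn.kneighbors(X)

    # Sort every point's distance to its (min_samples - 1)-th neighbour;
    # the distance where this curve bends sharply is a reasonable eps.
    k_distances = np.sort(distances[:, -1])
    print(k_distances[::50])                      # inspect (or better, plot) the sorted k-distances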

  • @kkevinluke
    @kkevinluke 3 years ago +1

    Is the silhouette score applicable to hierarchical clustering, given that some clusters are within other clusters? How do we differentiate a(i) from b(i) then?
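
It is applicable: the silhouette score only needs the data and the final flat labels (e.g. from cutting the dendrogram), and a(i)/b(i) are computed from pairwise distances regardless of how the clusters were found. A minimal scikit-learn sketch (the synthetic data is my assumption):

    from sklearn.cluster import AgglomerativeClustering
    from sklearn.datasets import make_blobs
    from sklearn.metrics import silhouette_score

    X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
    labels = AgglomerativeClustering(n_clusters=4).fit_predict(X)

    # a(i): mean distance from point i to the other points in its own cluster
    # b(i): mean distance from point i to the points in the nearest other cluster
    # silhouette(i) = (b - a) / max(a, b), then averaged over all points
    print(silhouette_score(X, labels))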

  • @anubhabsaha3760
    @anubhabsaha3760 1 year ago

    Andrew Ng of India == Krish Naik Sir

  • @arpitaingermany
    @arpitaingermany 1 year ago

    I don't understand: after knowing the clusters, we draw the dendrogram in hierarchical clustering, but you are showing that we need to draw a horizontal line and that the number of vertical lines it intersects will be the number of clusters? I mean, we are already drawing the dendrogram based on the clusters. What you said doesn't make sense to me.
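
For what it's worth, the dendrogram is built from the data itself (by successive merges), not from clusters that are already known; the horizontal cut is how the flat clusters are then read off. A sketch with SciPy (the synthetic data and the cut height are my assumptions):

    import matplotlib.pyplot as plt
    from scipy.cluster.hierarchy import dendrogram, fcluster, linkage
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=60, centers=3, random_state=0)

    Z = linkage(X, method="ward")        # successive merges computed from the data
    dendrogram(Z)
    plt.axhline(y=10, linestyle="--")    # the horizontal cut line
    plt.show()

    # The number of vertical lines the cut crosses equals the number of flat clusters:
    labels = fcluster(Z, t=10, criterion="distance")
    print(len(set(labels)), "clusters at a cut height of 10")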

  • @ALLINONEMEDIA33
    @ALLINONEMEDIA33 7 months ago

    Can I have the GitHub link here please 😵‍💫

  • @dataanalyst1012
    @dataanalyst1012 2 years ago

    Hello sir. Do you, by any chance, know about the assumptions of k means cluster analysis in the case of large variance?

  • @bhupeshmahara
    @bhupeshmahara 2 years ago

    Sir, if low bias with high variance is overfitting and high bias with high variance is underfitting, then what is high bias with low variance?

    • @shubhamnaik9555
      @shubhamnaik9555 2 years ago

      That is practically not possible, because you will not get a model that performs badly on training data but somehow performs well on test data.

  • @harshavardhansvlkkb2290
    @harshavardhansvlkkb2290 3 years ago

    10/10

  • @Aman-x2v6j
    @Aman-x2v6j 10 months ago

    Sir, can you please provide the GitHub link?

  • @basavarajag1901
    @basavarajag1901 2 years ago

    Can I get the material link?

  • @SidIndian082
    @SidIndian082 2 years ago

    The silhouette code is damn tough to understand, sir 😞

  • @RumiAnalytics2024
    @RumiAnalytics2024 2 years ago

    I didn't find the GitHub link, sir.

  • @deepsarkar2003
    @deepsarkar2003 3 years ago

    Where is the GitHub link for this?

  • @siddhantkohli5063
    @siddhantkohli5063 2 years ago

    Sir, please make a video on PCA.

  • @shreyasnatu3599
    @shreyasnatu3599 3 years ago

    Does anyone know where I can get data science/ML internships? I am in my third year of computer engineering.

  • @parthshah5482
    @parthshah5482 1 year ago

    Silhouette score