K-Mean Clustering

  • Published: 11 Sep 2024
  • Data Warehouse and Mining
    For more: www.anuradhabha...

Comments • 208

  • @sdelagey
    @sdelagey 4 years ago +50

    Finally somebody that actually shows the calculations at every step! Thank you so much, you have my like!

  • @MechBasketMkII
    @MechBasketMkII 3 years ago +5

    3 years and this is still very much useful! Thank you so much.

  • @phanurutpeammetta2066
    @phanurutpeammetta2066 3 years ago +7

    You saved my life! Wish me luck, I'll have an exam in the next 2 days. Hopefully I can utilize all that you taught in this video. Keep up the great work, ma'am!

  • @Me-dq5eo
    @Me-dq5eo 4 years ago +5

    I Finally understand the math behind this process. Thank you for walking through with actual data. This helps tremendously!

  • @joeaustinathimalamaria624
    @joeaustinathimalamaria624 5 years ago +2

    Helped a lot. Can't thank this lady enough. Just a small correction : Distance is (x-a)^2 + (y-b)^2
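
    Several comments flag the same slip in the distance formula, so for reference, here is a minimal Python sketch of the corrected Euclidean distance. The two points used in the check, (185, 72) and (170, 56), are only inferred from figures quoted elsewhere in these comments (the 15² = 225 remarks and the 21.93 value), so treat them as illustrative.

        import math

        def euclidean(p, q):
            """d((x, y), (a, b)) = sqrt((x - a)^2 + (y - b)^2) -- note y - b, not x - b."""
            (x, y), (a, b) = p, q
            return math.sqrt((x - a) ** 2 + (y - b) ** 2)

        # 15^2 + 16^2 = 225 + 256 = 481, and sqrt(481) is about 21.93
        print(round(euclidean((185, 72), (170, 56)), 2))   # 21.93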

  • @PritishMishra
    @PritishMishra 3 years ago +1

    Mam, the explanation was CRYSTAL CLEAR. Thanks! keep making these types of tutorials. It really really helps

  • @0youtubing0
    @0youtubing0 6 years ago +42

    Thanks for the informative video ! @3:15, the variable should be 'y', instead of 'x', (y-b)

  • @johnmosugu
    @johnmosugu 1 year ago

    You made this look so clear and understandable. I sincerely appreciate you for this all-important K-means computation video!

  • @IlliaDubrovin
    @IlliaDubrovin 1 year ago

    OMG, I've been looking for this for so long!!! You are the QUEEN!!!

  • @OviPaulDgRimJoW
    @OviPaulDgRimJoW 6 years ago +4

    Thank you very much. I was really confused about how this algorithm would be implemented, but you made it really easy to understand.

  • @surbhiagrawal3951
    @surbhiagrawal3951 4 years ago +1

    The best video seen so far on K-means

  • @prasadnagarale6274
    @prasadnagarale6274 5 years ago +4

    I think you did it for one iteration only, but in the next iteration any point could change its cluster, since the two means have changed.
    So basically we need to repeat the same procedure until the cluster mean values do not change for two consecutive iterations.

    • @prasannavi1911
      @prasannavi1911 5 years ago

      Prasad Nagarale, agreed. I have one question: which centroid values should we consider for the next iteration?
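
    The thread above has it right: the assignment and mean-update steps repeat, and each new pass starts from the centroids computed at the end of the previous pass, until the means stop changing. A bare-bones batch sketch of that loop in Python, with illustrative data and names rather than the video's exact working:

        import math

        def kmeans(points, centroids, max_iter=100):
            """Plain batch (Lloyd-style) K-means on coordinate tuples."""
            for _ in range(max_iter):
                # 1. Assignment step: every point goes to its nearest centroid.
                clusters = [[] for _ in centroids]
                for p in points:
                    nearest = min(range(len(centroids)),
                                  key=lambda i: math.dist(p, centroids[i]))
                    clusters[nearest].append(p)
                # 2. Update step, done only after the whole pass:
                #    each centroid becomes the mean of its members.
                new_centroids = [
                    tuple(sum(coord) / len(members) for coord in zip(*members))
                    if members else centroids[i]
                    for i, members in enumerate(clusters)
                ]
                if new_centroids == centroids:   # means unchanged -> converged
                    break
                centroids = new_centroids        # the next pass starts from the new means
            return centroids, clusters

        # e.g. kmeans([(185, 72), (170, 56), (168, 60), (179, 68)],
        #             centroids=[(185, 72), (170, 56)])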

  • @akankshamishra1139
    @akankshamishra1139 1 year ago

    Thank you Anuradha ji. Finally I understood what is K, what is mean, what is centroid, what is euclidean distance. Please create more videos covering major ML algorithms.

  • @gopalakrishnachinta3769
    @gopalakrishnachinta3769 1 year ago

    Your way of Explanation is easy to grasp Maam, Thank you 😇

  • @kowsisweety9113
    @kowsisweety9113 5 years ago +4

    Hi ma'am,
    Good and neat explanation of the k-means algorithm; it was very useful for me.
    I need an explanation of CLARA and CLARANS in partitioning algorithms for my exam.

  • @computology
    @computology 2 years ago +1

    Why are you updating the mean of the cluster after every assignment? Aren't we supposed to update the mean after the completion of a single full iteration, according to the original algorithm?
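
    For contrast with the batch loop sketched earlier, the video appears to fold each point into its cluster's mean as soon as it is assigned, which is roughly the online (MacQueen-style) variant rather than the textbook batch update. A sketch of that running-mean rule, with made-up numbers:

        def absorb(centroid, count, point):
            """Running-mean update: fold one new point into an existing centroid."""
            (cx, cy), (x, y), n = centroid, point, count + 1
            return (cx + (x - cx) / n, cy + (y - cy) / n), n

        # Hypothetical example: a cluster currently holding 2 points with mean (177.5, 64.0)
        c, n = (177.5, 64.0), 2
        c, n = absorb(c, n, (168, 60))
        print(c, n)   # (174.33..., 62.66...), 3 -- the mean of all three points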

  • @callm3pc
    @callm3pc 3 years ago +1

    Really straightforward and easy to understand.

  • @nehac9035
    @nehac9035 4 years ago

    The formula for calculating the Euclidean distance needs updating, as it contains (x-a) + (x-b) but it should be (x-a) + (y-b). Also check the square of 17: it should be 289 instead of 283.

  • @sandeeprawat3485
    @sandeeprawat3485 3 years ago

    This example is a bit difficult; you can simply take 2 rows directly and group them as 1 & 2, then find the Euclidean distance of each remaining row from those two, and the shortest distance decides the new row's group (1 or 2); the mean is then the grouped rows' sum divided by 2.
    Now the real concept, for whoever might go through my comment: K-means clustering is about finding a way to group similar sets of data (of any type, actually). So why do we need a mean here?
    1. When you calculate the distance from one point to another you simply take a-b (and you know that a > b); however, this may not hold on a graph or a 3-dimensional plot, so you take the sum of the squared differences of the two values x, y and then take a root, so that even if a < b in a-b, the result stays positive.

  • @aleeibrahim8672
    @aleeibrahim8672 6 years ago +2

    Shouldn't we update the centroid only once we have found all the distances between every data point and the previous centroid? You updated it with the average of the first two points only; why?

  • @AnselmGriffin
    @AnselmGriffin 3 years ago

    Are you sure your values are right? At 9 min 31 secs the new K1 should be (185+179+182)/3 and (72+68+72)/3. Also at 10 min 20 secs k1 = (185 + 179 + 182 + 188)/4 and k2 = (72+68+72+77)/4; the final centroid should be 183.5000, 72.2500 for K1 and 169.0000, 58.0000 for K2. I ran it in MATLAB and MATLAB confirms these answers.
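
    A quick check of the arithmetic quoted in this comment, using only the four points it lists for K1:

        k1_members = [(185, 72), (179, 68), (182, 72), (188, 77)]
        xs, ys = zip(*k1_members)
        print(sum(xs) / len(xs), sum(ys) / len(ys))   # 183.5 72.25, matching the MATLAB result above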

  • @paulourbanosegoper1216
    @paulourbanosegoper1216 2 years ago

    Woaaa, this really really helped me to understand more about K-means.

  • @amilcarc.dasilva5665
    @amilcarc.dasilva5665 5 years ago +3

    Thanks a lot. Systematic explanation and crystal clear.

  • @dnfac
    @dnfac 4 years ago +1

    Really simple and clear to understand, congratulations!

  • @nadellagayathri
    @nadellagayathri 3 years ago

    Anuradha, great work. Nowhere else have I found this detailed an explanation. Please try to make videos for deep learning algorithms in a detailed way like this.

  • @shalinisoni4257
    @shalinisoni4257 4 years ago

    Your video is easy to understand.... Very nice, ma'am.

  • @sathwik98
    @sathwik98 5 years ago +2

    Thanks for being my Savior for 10 marks.

  • @anshumansingh6969
    @anshumansingh6969 5 years ago +7

    The mean is calculated in a wrong manner; we have to take the average of all the values in our set whenever some new value is added..

    • @tahaali01
      @tahaali01 5 years ago +1

      you are correct

    • @vaddadisairahul2956
      @vaddadisairahul2956 3 years ago

      I think the mean is not calculated at every step, based on your explanation and hers. First, we assign all the data points to their nearest cluster and then take the average of all the points in a cluster as a whole.

  • @jiyuu329
    @jiyuu329 3 years ago

    what is the point of finding the distance between the two initial clusters?
    the pts themselves are the centroids for their respective cluster, right?

  • @tucuptea3689
    @tucuptea3689 1 year ago

    Ma'am, how do we know that we have assigned the initial 2 items in the right clusters?

  • @amanpathak2630
    @amanpathak2630 3 years ago +1

    Great explanation

  • @varadaraajjv4973
    @varadaraajjv4973 4 years ago

    Excellent video.... Thanks a lot, ma'am... you saved my time.

  • @Lens_lores
    @Lens_lores 6 years ago

    Thank you, Anuradha for such a comprehensive example.

  • @rajakcin
    @rajakcin 1 year ago

    Thanks for nice explanation, it helps.

  • @TheKnowledgeGateway498
    @TheKnowledgeGateway498 3 years ago +1

    What was the relevance of the (0, 21.93) values? There was no point in calculating them.

  • @coolbreeze007
    @coolbreeze007 1 year ago

    Thanks for the amazing job.

  • @janaspirkova4181
    @janaspirkova4181 5 years ago +1

    Dear Anuradha, thank you so so much.

  • @yashthaker9288
    @yashthaker9288 5 years ago +2

    Thank you so much ma'am for amazing explanation!

  • @MrHardrocker98
    @MrHardrocker98 4 years ago

    Why didn't you update the means as you did in the single-dataset video?

  • @nimrafaryad4103
    @nimrafaryad4103 1 year ago +1

    Thanks mam 👍🏼Jazak Allah mam

  • @lavanyarajollu4122
    @lavanyarajollu4122 4 years ago +1

    Best one from all others👏

  • @nynebioglu
    @nynebioglu 5 years ago +1

    Great explanation for K-means!
    Thanks.

  • @pratheekhebbar2677
    @pratheekhebbar2677 2 years ago

    a big thanks to you for this wonderful explanation

  • @ashutoshanadkarni4588
    @ashutoshanadkarni4588 5 years ago

    Well explained. Just a minor suggestion: most people watch on mobile, so it would be good to use the entire screen rather than a static title on the left. I liked the video.

  • @Naweeth03
    @Naweeth03 5 years ago

    Hi ma'am,
    I want an answer to this question -- Assume you are given n points in a D-dimensional space and an integer k. Describe the k-means++ algorithm for clustering the points into k clusters.
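
    In outline, k-means++ is ordinary k-means with a smarter seeding step: the first centre is drawn uniformly at random from the data, and each subsequent centre is drawn with probability proportional to its squared distance from the nearest centre already chosen; once k centres are seeded, the usual assign/update loop runs. A rough sketch of the seeding for points given as coordinate tuples (hypothetical helper name, assumes the points are distinct):

        import math
        import random

        def kmeans_pp_seeds(points, k):
            """Pick k initial centres using the k-means++ rule."""
            centres = [random.choice(points)]
            while len(centres) < k:
                # Squared distance from each point to its nearest chosen centre.
                d2 = [min(math.dist(p, c) ** 2 for c in centres) for p in points]
                # Sample the next centre with probability proportional to d2.
                r = random.uniform(0, sum(d2))
                acc = 0.0
                for p, w in zip(points, d2):
                    acc += w
                    if acc >= r:
                        centres.append(p)
                        break
            return centres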

  • @xc295
    @xc295 3 years ago

    The centroid coordinates are continuously changing. Initially, we took co-ordinates of points 1 and 2 as two centroids, so should we not re-check if points 1 and 2 still belong to the initial cluster to which they were assigned?

  • @kavibharathi2913
    @kavibharathi2913 1 year ago

    Very very useful 👍 Thank u so much......💫

  • @br1batman287
    @br1batman287 1 year ago

    In the next dataset (the 3rd one), 17 squared should be 289, but it is written as 283. I know the answer is still correct; just informing.

  • @MegaDk13
    @MegaDk13 6 years ago

    The cluster assignment for the first 2 clusters is an assumption, though we can justify it by the Euclidean distance calculation.

  • @20shwetha
    @20shwetha 2 years ago

    very very useful video thank you so much madam.

  • @bhartinarang2078
    @bhartinarang2078 7 years ago

    Wow 💪 Now we will have DWM videos. Thanks, madam. Please keep them coming; your content is helping us. And yes, the BDA paper was lengthy, but your videos covered 30 marks or more altogether: page ranks, sums, the FM Algorithm.

    • @AnuradhaBhatia
      @AnuradhaBhatia 7 years ago

      Thanks,
      Will surely put them up.

    • @AnuradhaBhatia
      @AnuradhaBhatia 7 years ago +1

      Uploaded: Hierarchical Agglomerative Clustering and the Apriori Algorithm.

    • @bhartinarang2078
      @bhartinarang2078 7 years ago

      Anuradha Bhatia yes madam, I got the notification :) thanks.

    • @AnuradhaBhatia
      @AnuradhaBhatia 7 years ago +1

      More following.

    • @bhartinarang2078
      @bhartinarang2078 7 years ago

      Anuradha Bhatia That's great. Waiting eagerly :)

  • @RishabVArun
    @RishabVArun 3 years ago +1

    Don't update the cluster centroid after every assignment; update it after a whole iteration (once all assignments in one iteration are complete).

  • @navulurinitishkumar3911
    @navulurinitishkumar3911 2 years ago

    Can we take any two initial centroids randomly?

  • @jiayuliao4358
    @jiayuliao4358 5 years ago +1

    very clear explanation!

  • @heartborne123
    @heartborne123 4 years ago

    What is the point of calculating the distance from centroid 1 to centroid 1 and from centroid 2 to centroid 2? Isn't it obvious the distance in this case is going to be 0?

  • @justateenager9773
    @justateenager9773 3 years ago

    Really Excellent Mam...

  • @agnimchakraborty1112
    @agnimchakraborty1112 3 years ago +1

    Ma'am, to which cluster should we assign the coordinate (or data point) if the Euclidean distance is the same from both clusters?

    • @ujjwalkarnani1059
      @ujjwalkarnani1059 3 years ago +1

      You can assign it to either of the two in that case.
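
      In practice this is usually made deterministic by whatever tie-breaking convention the code applies, e.g. an argmin that keeps the first (lowest-indexed) cluster; either choice is equally valid for the algorithm. An illustrative two-liner:

          distances = [5.0, 5.0]                      # equidistant from both centroids
          cluster = distances.index(min(distances))   # index() returns the first minimum -> cluster 0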

  • @spandanamanoj4890
    @spandanamanoj4890 5 years ago

    Wonderful lecture, ma'am... The link for more videos is not working... Kindly post more videos on Data Mining. Thank you.

  • @sachinjagtap8936
    @sachinjagtap8936 2 years ago

    Great stuff, thanks for explaining.

  • @shabnamparveen6785
    @shabnamparveen6785 3 years ago

    Very good session... but it should be x-a and y-b.

  • @coolestcatintown1501
    @coolestcatintown1501 6 years ago +1

    Perfectly clear, thank you very much.

  • @diego.777ramos
    @diego.777ramos 5 years ago +1

    Excellent channel, thank you very much for your knowledge; liked and subscribed.

  • @saurabhzinjad7249
    @saurabhzinjad7249 5 years ago +8

    15 squared is 225, not 255. Please watch out for the small mistakes. The mean calculation is also wrong.

    • @alex-ek8vt
      @alex-ek8vt 4 years ago

      I'll send you straight into the vacuum!

  • @APREETHAMCS
    @APREETHAMCS 4 years ago +1

    Great! Thank you for the video!

  • @amarprasanth2614
    @amarprasanth2614 3 years ago

    Wouldn't the equation be root((x-a)^2 + (y-b)^2)?
    (Maybe a small mistake)

  • @nikhilmkumar2765
    @nikhilmkumar2765 5 years ago

    Thanks a lot, ma'am. Helped me for my exam!

  • @liyaelizabethantony5675
    @liyaelizabethantony5675 3 years ago

    At 3:50 it's y-b (Euclidean distance) and not x-b.

  • @nvsk.avinash2257
    @nvsk.avinash2257 5 years ago

    Why didn't we recalculate the cluster center in K-means clustering for the single dataset?

  • @just_a_viewer5
    @just_a_viewer5 1 year ago

    4:53 - 15^2 is 225, not 255

  • @ghoshdipan
    @ghoshdipan 3 years ago

    How can we find the K value for a large data set?
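
    One common heuristic is the elbow method: run K-means for several values of K, record the within-cluster sum of squares each time, and pick the K where the curve bends and further clusters stop helping much. A sketch assuming scikit-learn is available, with made-up 2-D data:

        from sklearn.cluster import KMeans

        X = [[185, 72], [170, 56], [168, 60], [179, 68], [182, 72], [188, 77]]  # illustrative points
        for k in range(1, 6):
            wcss = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
            print(k, round(wcss, 2))   # look for the "elbow" where wcss stops dropping sharply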

  • @dasarojujagannadhachari8015
    @dasarojujagannadhachari8015 5 years ago +1

    Wonderful lecture mam.. thank you

  • @mebrunoo
    @mebrunoo 6 years ago +1

    Thank you for the good and detailed explanation.

  • @SantoshRaj-hx7rx
    @SantoshRaj-hx7rx 5 years ago

    Thanks for the explanation.. it's very clear...

  • @geekaffairs6475
    @geekaffairs6475 2 years ago

    Again, the formula needs correction; I don't know why you are not correcting it while going through it.

  • @youtubeuser9372
    @youtubeuser9372 3 years ago

    Thank you mam for detailed explanation

  • @amitbaderia4194
    @amitbaderia4194 6 years ago

    Correct the typo: 17 squared = 289 (not 283).

  • @mailanbazhagan
    @mailanbazhagan 6 years ago

    great and easily understandable explanation.

  • @tshende02
    @tshende02 1 year ago

    Thank you so much mam,, love 😍 ❤️

  • @sid22april
    @sid22april 4 years ago

    Is this a correct approach, lol? Don't we update the cluster centroid after assigning all the points, and then iterate until the clusters don't change? I am new to this. Enlighten me, peeps.

  • @rubyangel1469
    @rubyangel1469 6 years ago

    It helped 👌.. thank you.. but a simple mistake in the video which I noticed: square(15) = 225, but 255 is written there...

  • @mariomartinsramos6450
    @mariomartinsramos6450 6 years ago

    Best explanation ever about k-means!
    Thanks!!

  • @PR-ql7tg
    @PR-ql7tg 2 years ago

    4:23 wrong formula, you forgot the y and replaced it by x

  • @kamalb3326
    @kamalb3326 6 years ago

    Superb explanation. Thank u

  • @panostzakis6925
    @panostzakis6925 1 year ago

    Thanks for your help, I appreciate your time. But maybe there is a slip in the Euclidean distance: d[(x,y),(a,b)] = root of (x-a)^2 + ... + (y-b)^2, and not ...(x-b)^2, from my point of view!!

  • @mprasad3661
    @mprasad3661 5 years ago

    Great explanation madam

  • @sagardnyane3813
    @sagardnyane3813 5 years ago +1

    Thanks for the explanation of K-means. Can you share any PDF file related to K-means?

  • @vigneshwaravr3283
    @vigneshwaravr3283 3 years ago

    Fully explained

  • @aiswarya3461
    @aiswarya3461 5 years ago

    Thank you ma'am for a great lesson !!

  • @FunmiOg
    @FunmiOg 6 years ago

    Thank you so much ma. Very helpful video.

  • @annaet7769
    @annaet7769 5 years ago +1

    Thank you😊

  • @tahaali01
    @tahaali01 5 years ago +1

    You have calculated the wrong centroid.

  • @vashmchannel7266
    @vashmchannel7266 4 years ago +1

    Love your lectures.

  • @mithrasenthil7516
    @mithrasenthil7516 5 years ago +1

    Good one mam, Thank you

  • @x2diaries506
    @x2diaries506 6 years ago

    @Anuradha Bhatia, can you help me apply K-means clustering to localization using VLC?

  • @ismailkarnanokung4753
    @ismailkarnanokung4753 6 years ago

    thanks for everything this video is very instructive...

  • @NandDulalDas2810
    @NandDulalDas2810 6 years ago +2

    simply wonderful/easy to understand

  • @rajanisingh3128
    @rajanisingh3128 6 years ago

    Great ma'am.

  • @bajwayt1
    @bajwayt1 4 years ago

    What is cross minimization in clustering

  • @shalinisoni4257
    @shalinisoni4257 4 years ago

    Ma'am, please make a video showing how to solve example sums on cosine similarity. Please.

  • @Uma7473
    @Uma7473 5 years ago +1

    Thank you 👏👏👏🙏👼