Machine Learning Tutorial Python - 13: K Means Clustering Algorithm

  • Published: 5 Oct 2024

Comments • 601

  • @codebasics
    @codebasics  2 years ago +12

    Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced

    • @vitaltopics316
      @vitaltopics316 2 years ago

      why do we not need to use centroids for the iris dataset?

  • @arrahman100
    @arrahman100 5 years ago +198

    You make Machine Learning so easy to understand. I would say you are a SAVER for people who are struggling to understand different ML algorithms. Thank you so much. Please, if possible, put out some content on NLP.

  • @shauryabhatnagar71
    @shauryabhatnagar71 3 years ago +34

    You are probably one of the best teachers I have come across. Thank you so much!

  • @wyphonema4024
    @wyphonema4024 4 years ago +4

    My grad school professor explains this very badly. You explain things very well and with patience; you are the definition of a good teacher.

  • @hshrestha2811
    @hshrestha2811 4 years ago +69

    Summarizing the algorithm for K Means clustering based on this video:
    1. Start with k centroids placed at random points (here k = 2)
    2. Compute the distance of every point from each centroid and assign it to the nearest cluster
    3. Adjust the centroids so they become the center of gravity of their cluster
    4. Recluster every point based on its distance to the adjusted centroids
    5. Repeat steps 3-4 until data points stop changing clusters

    • @adipurnomo5683
      @adipurnomo5683 3 years ago +1

      Noted!

    • @mychanneltest8623
      @mychanneltest8623 1 year ago +1

      7. done and put the ruler away

    • @ganeshn9464
      @ganeshn9464 9 months ago

      Please correct me if I am wrong -
      2. Compute distance of every point from centroid and cluster them accordingly - this entire process is "built" inside KMeans.fit_predict(). Correct?
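
      Yes: that entire assign-and-recompute loop happens inside KMeans.fit() / KMeans.fit_predict(); fit_predict() simply returns the resulting labels as well. Below is a minimal NumPy sketch of the loop summarized above, for illustration only (scikit-learn's actual implementation adds k-means++ initialization, several restarts, empty-cluster handling, etc.):

          # Rough sketch of what KMeans.fit_predict() does internally (illustrative only)
          import numpy as np

          def kmeans(X, k, n_iter=100, seed=0):
              rng = np.random.default_rng(seed)
              centroids = X[rng.choice(len(X), size=k, replace=False)]  # 1. random starting centroids
              for _ in range(n_iter):
                  # 2. assign each point to its nearest centroid
                  dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
                  labels = dist.argmin(axis=1)
                  # 3. move each centroid to the mean (center of gravity) of its cluster
                  new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
                  # 4./5. stop once centroids (and hence assignments) no longer change
                  if np.allclose(new_centroids, centroids):
                      break
                  centroids = new_centroids
              return labels, centroids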

  • @AltafAnsari-tf9nl
    @AltafAnsari-tf9nl 3 years ago +7

    I have started loving machine learning due to the simplicity of explanations.

  • @qas4273
    @qas4273 4 years ago +14

    It's a blessing to be able to finally say that I can learn ML, thanks to you :). I have used 'HUE' from seaborn instead of writing plt.scatter for every group of the cluster. sns.scatterplot(df['Age'], df['Income($)'], hue = df['Cluster'])

    • @manishsharma2211
      @manishsharma2211 4 years ago +2

      Thanks for this 👍

    • @codebasics
      @codebasics  3 months ago +1

      Yes, with seaborn you can do it in one line. Thanks for posting this comment 🙏🏼🙏🏼
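
      For anyone on a recent seaborn release: positional arguments to scatterplot were removed, so the same one-liner needs keyword arguments. A sketch, assuming df has the Age, Income($) and Cluster columns from this thread:

          import seaborn as sns
          import matplotlib.pyplot as plt

          # one-line cluster plot; hue colors each point by its cluster label
          sns.scatterplot(data=df, x='Age', y='Income($)', hue='Cluster', palette='deep')
          plt.show()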

  • @cindinishimoto9528
    @cindinishimoto9528 4 years ago +57

    This whole ML series is so exciting. I'm learning and having fun during the quarantine in Brazil, SP.
    Thanks, @codebasics

  • @nilupulperera
    @nilupulperera 4 years ago +8

    What a beautiful explanation. The beauty of Data Science is shown in this video in a remarkable way.
    The exercise is really beautiful!
    Thank you very much, Sir.

    • @codebasics
      @codebasics  4 years ago +2

      Dear Nilupul, thanks for the comment. Keep learning. 😂

  • @beansgoya
    @beansgoya 5 years ago +5

    Fantastic explanation. I like the way you showed us what happens if you don’t scale your features. You also waited for the perfect opportunity to show why we need to use the elbow method.

    • @codebasics
      @codebasics  5 years ago +2

      Thanks for your feedback, Kin. 👍😊 Feedback like this helps me continue doing the good things, and any critical feedback is welcome as well, since it helps me improve 👍

  • @sidduhedaginal
    @sidduhedaginal 4 years ago +21

    Sir, you made Machine Learners' lives easy....an amazing explanation like I have never seen before, and with the elbow technique we got K=3 for the iris dataset.

    • @codebasics
      @codebasics  4 years ago +4

      Thanks for your kind words and I am happy you liked it 😊

    • @shubhangiagrawal336
      @shubhangiagrawal336 4 years ago +4

      @@codebasics Sir, can you please make one video on the KNN algorithm? I need it so badly. Thank you

  • @amilcarc.dasilva5665
    @amilcarc.dasilva5665 5 years ago +30

    Excellent tutorial. This is highly recommended to watch. Thanks a lot, Sir, I find it helpful in my project work....I really appreciate it. You have done great work to help others. Keep doing this great work.

  • @aliouahli3185
    @aliouahli3185 2 years ago

    You can't find one of your videos and not end up watching all the playlists. I'm so grateful to you, thank you sir!

  • @Rus1310CMRS
    @Rus1310CMRS 1 year ago

    I watched the first 5 minutes, and your teaching style for ML is spot on, better than the IIT professors. I am enjoying ML algos now. Thanks.

  • @Mukeshsingh-zn9rq
    @Mukeshsingh-zn9rq 4 years ago +1

    was trying out tons of videos trying to understand the basics of ML, you made it so simple and quick.
    Loved it!!

  • @andreabrunelli2030
    @andreabrunelli2030 3 years ago

    Dear CodeBasics, your tutorials are way better than all the classes of the Master in AI I have just completed. Thank you very much!

    • @codebasics
      @codebasics  3 years ago +1

      Glad it helped you 😊

  • @Kingsohio
    @Kingsohio 2 years ago

    This is a great quick refresher for those with the basic knowledge of ML clustering algorithms

  • @namansinghal9090
    @namansinghal9090 4 years ago +5

    I must say!! you are making life a lot easier for all of us!!! Thanks a lot mannn.. Your efforts are really appreciated. Keep up the hard work.

  • @bhaskarg8438
    @bhaskarg8438 2 years ago +1

    Your explanation is clear and the content has clarity.. and sharing knowledge with the Data Science community that needs it is noble... thank you... 🙏

  • @praveenjagarlapudi7891
    @praveenjagarlapudi7891 1 year ago

    I have got the real clarity after watching your video, This is a great help. Thank you for all the videos.

  • @commercial3750
    @commercial3750 1 year ago

    This was awesome! I can't believe I learned how to do K-means clustering in just a few hours. Your explanations are clear and concise. Thank you so much!

  • @thirugnanamselvam2304
    @thirugnanamselvam2304 5 years ago +5

    All your videos are clear and good. Congrats for that. Can you please make a video for recommender systems and NLP

    • @codebasics
      @codebasics  5 years ago +3

      Thanks thirugnanam, point noted!

  • @raom2127
    @raom2127 3 years ago

    I am following the whole ML series; your videos are nicely explained, and your explanation and approach are awesome. I have now stopped watching news and social media and am just following your videos..... your videos have magnetic power.

  • @meghnasachdeva3174
    @meghnasachdeva3174 4 years ago +1

    The way you explain is commendable, making it so easy even for beginners.... thank you so much for your efforts, sir.. really

  • @NikitaSharma-bs4gg
    @NikitaSharma-bs4gg 2 years ago

    I searched for 3-4 days and I only got the plotting after seeing your video- Thanks a lot

  • @jayagrawal3481
    @jayagrawal3481 9 months ago

    Really liking the course. I can't believe I have already watched more than 50 videos from the playlist; only 50 more to go.

  • @skyblue021
    @skyblue021 3 years ago +1

    This is the best-explained K-means on the internet - period. Thank you!

  • @pandaparas8500
    @pandaparas8500 5 years ago +1

    You make every topic so easy to understand. Long time no video, we miss your videos. Please upload video regularly. Once again good to see your videos. Thank u

    • @codebasics
      @codebasics  5 years ago +1

      Panda, thanks for your appreciation. I am going through some health issues and that's the reason I am not able to upload content on a regular basis. I will start uploading once I recover... 🙂

    • @pandaparas8500
      @pandaparas8500 5 years ago +1

      @@codebasics Oh sorry, I was not aware of it. It's ok. Take your time and concentrate on a full recovery. Get well soon, sir. Take care. Once again, thank you for everything 😊

  • @maruthiprasad8184
    @maruthiprasad8184 2 years ago

    I got 3 as the optimal value for the iris dataset. Thank you very much for the simple & great explanation.

  • @shubhamwaingade4144
    @shubhamwaingade4144 2 years ago

    It was good, simple, informative, no errors. I guess these are enough clusters to define the quality of this video.

  • @LamNguyen-nm1id
    @LamNguyen-nm1id 1 year ago +1

    Both StandardScaler and MinMaxScaler worked perfectly for petal. The elbow had me between 2 and 3, but 3 seems to have better variance than 2 when plotting it in a scatter graph.
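
    For reference, a sketch of the elbow check several commenters mention, run on the iris petal features with MinMax scaling as in the exercise (the bend typically shows up around k = 3):

      import pandas as pd
      import matplotlib.pyplot as plt
      from sklearn.cluster import KMeans
      from sklearn.datasets import load_iris
      from sklearn.preprocessing import MinMaxScaler

      iris = load_iris()
      df = pd.DataFrame(iris.data, columns=iris.feature_names)[['petal length (cm)', 'petal width (cm)']]
      df[df.columns] = MinMaxScaler().fit_transform(df)

      sse = []
      k_range = range(1, 10)
      for k in k_range:
          km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(df)
          sse.append(km.inertia_)            # sum of squared distances to the closest centroid

      plt.plot(k_range, sse, marker='o')     # the "elbow" in this curve marks a reasonable k
      plt.xlabel('K')
      plt.ylabel('Sum of squared error')
      plt.show()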

  • @santetzuken8903
    @santetzuken8903 4 years ago

    These videos are more helpful than all of the classes I took in my university combined.

    • @codebasics
      @codebasics  4 years ago +2

      I am happy this was helpful to you.

  • @doyoonkim4187
    @doyoonkim4187 2 years ago

    The best ML lesson I've ever heard

  • @poojabehera8675
    @poojabehera8675 4 years ago +1

    Thank you so much for making machine learning so easy to understand with this series, sir. The more I look at your ML content, the more I want to go through the entire series with such clear understanding. With passion for the subject and curiosity to search, I have gone through so many tutorials so far, but codebasics is the best learning source for beginners like me, and I would definitely recommend it to freshers/beginners for a clear base understanding of data science.... MUCH APPRECIATED

    • @codebasics
      @codebasics  4 years ago +1

      Thanks Pooja for your kind words. This means a lot to me and gives me fuel to continue my work. If you like my videos, would you share this series and the deep learning series on your Facebook page, LinkedIn, or WhatsApp? That way maximum people can benefit from it.

    • @sajjadkhan-oc2bk
      @sajjadkhan-oc2bk 3 years ago

      @@codebasics Thanks for your efforts! Stay blessed...

  • @liliyalopez8998
    @liliyalopez8998 2 years ago

    I love how you explained the material in plain language. You made it very easy to follow and understand ❤

  • @saratbabum1303
    @saratbabum1303 4 years ago +1

    Nice way of explaining the complicated concept with an example. Great Job !!! Thanks a lot

  • @yashchauhan5710
    @yashchauhan5710 5 years ago +3

    Sir, I'm assuming this is the end of your series on ML... Can you provide us further topics to explore, just a road map for the path ahead?
    Again, really helpful and well-explained videos so far

    • @codebasics
      @codebasics  5 years ago +3

      Sure yash. My next set of topics is going to be on deep learning, I've already uploaded 2 tutorials on that and will continue to upload more in future.

    • @rajushahi5089
      @rajushahi5089 4 years ago

      Sir, how do we cluster if we need to cluster a weather dataset that includes min temp, max temp, wind speed, humidity, rain, etc.?

  • @akshrags-mindmaze3407
    @akshrags-mindmaze3407 1 year ago +1

    7:25 Brilliant Explanation!

  • @flyingsalmon
    @flyingsalmon 2 years ago

    Fantastic coverage...you covered the basics, then talked about the reasons and potential challenges with real-world data, and showed some amazing methods to visualize differently. Thank you for your continued contribution to learning and sharing with the community. This kind of tutorial is what will make newcomers gravitate toward ML and be glad to learn.

    • @codebasics
      @codebasics  2 years ago +1

      Glad you enjoyed it!

    • @flyingsalmon
      @flyingsalmon 2 years ago

      @@codebasics Absolutely. I do have a follow-up question. I got lost after scaling the values...let me explain. I have a dataset of tips ($) and total restaurant bill ($) per day per group of customers. I got 3 clusters and they look good after rescaling the bill and tips via MinMax (my x-axis is the bill for food, and y-axis is tips $). But after clustering, I can't tell how to map the rescaled x-axis and y-axis values (which are 0..1 floats) back to the actual dollar values in the dataset. I need to know how the clusters map to the real dataset. Is there a way to do that logically? Would really appreciate your input. TY!
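
      One common way to get back to dollar values, sketched here with hypothetical total_bill and tip column names: keep the original columns untouched and use the scaled copy only for fitting, or run the cluster centers through the scaler's inverse_transform:

          from sklearn.cluster import KMeans
          from sklearn.preprocessing import MinMaxScaler

          scaler = MinMaxScaler()
          scaled = scaler.fit_transform(df[['total_bill', 'tip']])   # 0..1 copy, used only for fitting

          km = KMeans(n_clusters=3, n_init=10, random_state=0)
          df['cluster'] = km.fit_predict(scaled)

          # Option 1: the original dollar columns are untouched, so summarize/plot them directly
          print(df.groupby('cluster')[['total_bill', 'tip']].mean())

          # Option 2: map the scaled cluster centers back to dollar units
          print(scaler.inverse_transform(km.cluster_centers_))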

  • @86Plum
    @86Plum 4 years ago +1

    Amazing video and explanation! Just started learning about Machine Learning algorithms and this is incredibly helpful. Thank you!

  • @AlonAvramson
    @AlonAvramson 3 years ago +1

    Thank you! really enjoyed this session. I tried both Petal and Sepal and it went very well.

  • @prakashdahal2560
    @prakashdahal2560 4 years ago +1

    One of the most useful tutorials I have ever seen

  • @beansgoya
    @beansgoya 5 years ago +3

    Not sure if anyone else had this problem, but at 16:00 I had to add an extra set of brackets when I did this exercise. Maybe my Python is outdated or something.
    scaler = MinMaxScaler()
    scaler.fit(df[['Income($)']])
    df['Income($)'] = scaler.transform(df[['Income($)']])
    df

    • @codebasics
      @codebasics  5 years ago

      Kin, what you are doing is correct. In the tutorial I got a warning but ignored it, but yes, it takes a 2D array as input. Here is my correct code: github.com/codebasics/py/blob/master/ML/13_kmeans/13_kmeans_tutorial.ipynb

    • @kale_hyder
      @kale_hyder 4 years ago

      Kin Cheng thank you so much
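
      The reason the extra brackets matter: scikit-learn transformers expect a 2D array of shape (n_samples, n_features). df['Income($)'] is a 1D Series, while df[['Income($)']] is a one-column DataFrame. A quick check:

          df['Income($)'].shape      # (n,)   -> 1D Series, triggers the warning/error in fit()
          df[['Income($)']].shape    # (n, 1) -> 2D, what MinMaxScaler.fit() expects
          # an equivalent alternative: df['Income($)'].values.reshape(-1, 1)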

  • @praveensevenhills
    @praveensevenhills 1 year ago

    Great explanation, starting from a small example and elaborating to cover the whole concept in a nutshell. Very well explained

  • @tejas4054
    @tejas4054 1 year ago

    This is the best of all the ML videos on YouTube

  • @abhijitkundargi150
    @abhijitkundargi150 4 years ago +2

    Amazed!!! Understood the concept in just one go! The hands-on part is by far the best I ever saw.. Keep up the good work, Sir; please post more videos on the most widely used algorithms. Thank you.

  • @sushiltry
    @sushiltry 3 years ago +1

    Superb explanation, but I feel the elbow plot should be drawn first so the choice of K is clearer

  • @zainishah5540
    @zainishah5540 2 years ago

    Oh man just wow wow wow. You made my day what a lesson it was. Absolutely you nailed it

  • @mohankrishna1680
    @mohankrishna1680 2 years ago

    Huge respect for your hard work; the videos increase learning curiosity 👏👏👏👏

  • @BhuvaneshSrivastava
    @BhuvaneshSrivastava 4 years ago +2

    This is a great video.. just one thing you missed: on what kind of dataset we should not use K-means and how to identify it.
    If this had been included, this would be 100% complete.

    • @tammy4994
      @tammy4994 4 years ago +1

      You can use any kind of data but just the data preprocessing steps would be different

  • @sunilsharanappa7721
    @sunilsharanappa7721 4 years ago +1

    You are awesome, you make complex things simple. Please keep up the good work.

  • @vikrampruthvi5
    @vikrampruthvi5 5 years ago +2

    Thanks a ton... I love your simplicity in explanation and perfection in explaining hurdles that everyone might face. Please keep doing this great work.

  • @igorbuzov3778
    @igorbuzov3778 4 years ago +1

    Going to university, studying, learning, reading from books
    And then you find some random guy from (I assume) India who explains it in a simple way, in less than 30 minutes... Respect!

    • @codebasics
      @codebasics  4 years ago +3

      I am glad Igor that this was helpful to you 😊👍

  • @heaven3279
    @heaven3279 3 years ago

    Excellent quick and short explanation of K-means. Appreciate it

  • @abhishek4985
    @abhishek4985 3 years ago

    Saw so many videos, but this one video helped the most. Thanks!

  • @shashankkkk
    @shashankkkk 3 years ago

    Man, you shouldn't have put these playlists on YouTube for free... these are gems... people should pay at least some small amount to learn from them.. You are awesome, sir... The things which I wasn't able to learn elsewhere, you taught me here... hats off, sir... hope we will meet one day

    • @codebasics
      @codebasics  3 years ago +1

      Thanks for your appreciation, sure when I am in India I will plan for a Meetup

  • @spacetimevideostudio109
    @spacetimevideostudio109 2 years ago

    Thank you so much Dhaval for this video, with my elbow technique I got k=3.
    Thanks

  • @shivamtyagi5614
    @shivamtyagi5614 4 years ago

    Exercise done. Viewing the initial plot, n_clusters seems equal to 2, but using the elbow method makes it clear to use n_clusters=3. Enjoying this holiday!!!

  • @henyermogollon679
    @henyermogollon679 2 years ago

    God bless you. You are ML Guru! I love your content. very easy to understand the basics of everything.

  • @abhinavsharma6633
    @abhinavsharma6633 3 years ago +1

    Really Appreciate your lectures 🙏🙏
    Btw, the value of K by elbow technique you taught is 3.

  • @Mushsayer
    @Mushsayer 3 years ago

    Thanks a lot for the video. You taught K-Means Clustering in 10 mins!

  • @adisaolaitan5475
    @adisaolaitan5475 5 years ago +2

    Thanks so much for your easy-to-understand tutorials. You are a blessing. God bless you!

  • @thaanathaana4522
    @thaanathaana4522 8 months ago

    Really useful videos.. I had too many doubts in machine learning.. came across one video and became a subscriber.. thanks, brother, for the clear explanation

  • @techienomadiso8970
    @techienomadiso8970 3 years ago +1

    Wow. You are a great man. You've made it soo simple to understand. Thank you Sir. 🔥🔥🔥

  • @dineshpabbi7005
    @dineshpabbi7005 5 years ago +2

    Sir , I found your channel just a week ago and i would just like to thank you so much for such a wonderful content ! Please continue the ML series .. I hope you also make tutorial of Neural Networks and their application!

  • @kmnm9463
    @kmnm9463 4 years ago

    Hi Dhaval ji - excellent video on KMC. Very precise in presenting, Particularly liked the cluster_centers_ and inertia_ concepts. The final elbow plot with for loop being the starting point was unparalleled in clarity. Thanks a lot

  • @Sheblah1
    @Sheblah1 1 year ago

    this is a perfect introduction to k-means thank you for making this video👌

  • @visakhv1805
    @visakhv1805 5 years ago +1

    Hope your health is fine now. When you get time, please include some contents on Principal Component Analysis. Thank you very much for your great effort.

    • @codebasics
      @codebasics  5 years ago +1

      Hey visakh, PCA is in my Todo list. So sure I will make tutorial on that in future

  • @AntonioGarcia-ck5hy
    @AntonioGarcia-ck5hy 2 years ago

    Thanks a lot for the tutorial video, @codebasics. You are an excellent teacher.

  • @ousmanealamakaba3135
    @ousmanealamakaba3135 2 years ago

    You are very strong. Thank you so much for making this class easy

  • @Suigeneris44
    @Suigeneris44 5 years ago

    You are really good! I would be happy to pay for such clear code lectures. Very well articulated! Keep it up!

    • @codebasics
      @codebasics  5 years ago

      Suigeneris44, I appreciate your comment buddy. I am glad you found this to be useful :)

  • @pulkitgupta1946
    @pulkitgupta1946 5 years ago +1

    Sir, please just keep uploading videos. Your videos help a lot.

    • @codebasics
      @codebasics  5 years ago +2

      Sure, Pulkit. I will try my best to keep producing as many good machine learning videos as I can.

    • @pulkitgupta1946
      @pulkitgupta1946 5 years ago

      Thanks

  • @Saikumer-g4p
    @Saikumer-g4p 5 months ago

    Awesome explanation from you, sir. I think this is the best tutorial for k-means clustering on YouTube. I have seen a lot of videos on this topic, but this one gave me a boost to create an amazing model in ML. Thank you for all this, sir.
    My feedback: you should give some more examples on these topics, which will help us make beautiful things from this.

  • @prasannavi1911
    @prasannavi1911 4 years ago +1

    You are awesome. You made me think ML is not complex to learn.

    • @codebasics
      @codebasics  4 years ago

      Indeed, ML is not as complex as people think it is !

    • @prasannavi1911
      @prasannavi1911 4 years ago

      codebasics thank you. Post more videos; your way of explaining makes things well understood.

  • @simranthiara6616
    @simranthiara6616 3 years ago

    Best channel for explanations on ML algorithms. Thank you so much :) , definitely subscribed .

  • @spikeydude114
    @spikeydude114 2 years ago +1

    Great example! Very clear and straightforward explanation. Would you have any examples for time series data?

  • @sufyanhamid9560
    @sufyanhamid9560 4 years ago

    It is very helpful for understanding the working of the algorithms and the code as well. I wish to learn your complete ML series.

    • @codebasics
      @codebasics  4 years ago +1

      Sufiyan I am glad you liked it. I am going to add many more tutorials in ML series. Stay tuned buddy 😎

  • @Ankurkumar14680
    @Ankurkumar14680 5 years ago +3

    Another excellent video Sir, it is difficult to wait to see your videos on Neural Networks...as you mentioned in the comments below. thanks a ton for your efforts

  • @zerostudy7508
    @zerostudy7508 5 years ago +1

    Cool, another one for my list... I love clustering, Mr. @codebasics.
    In the exercise I also found that we can use km.predict([[7, 3]]) to get the cluster label for petal length = 7 and petal width = 3.
    But the cluster labels returned are not constant: sometimes the unique labels come out as [0 1 2] and sometimes as [2 0 1]. It is still the same clustering, just with different label names, so the same result.
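
    The label numbers are indeed arbitrary, since the centroids start at random positions; passing random_state makes the numbering reproducible. A small sketch (X stands for the petal length/width features used in the exercise):

      from sklearn.cluster import KMeans

      km = KMeans(n_clusters=3, n_init=10, random_state=42)   # fixed seed -> same labels on every run
      labels = km.fit_predict(X)                               # X: petal length / petal width columns

      # cluster id for petal length = 7, petal width = 3 (same units/scale as X)
      print(km.predict([[7, 3]]))

      # [0 1 2] vs [2 0 1] can describe the same partition; only the label names differ.
      # sklearn.metrics.adjusted_rand_score(run1_labels, run2_labels) is 1.0 in that case.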

  • @AjaySharma-jv6qn
    @AjaySharma-jv6qn 2 years ago

    You make things quite simple. Please keep posting..

  • @amrelkholy6662
    @amrelkholy6662 3 years ago

    you are amazing, I like your simplicity in delivering the information, thank you very much

  • @aviralgupta9364
    @aviralgupta9364 4 years ago +1

    Fantastic tutorial!!! Thanks a lot, sir... It gives the best explanation

  • @monfreign
    @monfreign 4 years ago

    I've been researching these regressions and other methods and have always found myself on your videos; you, sir, earned my sub :)

  • @nishiraju6359
    @nishiraju6359 4 years ago

    The way you explained it is really understandable... Keep uploading more and more videos on ML.. with case studies.. Thanks in advance

  • @bishwadeepsikder3018
    @bishwadeepsikder3018 1 year ago

    The best explanation ever... Thank you so much

  • @sharathkrishnan1558
    @sharathkrishnan1558 3 years ago

    Excellent, amazing . You make it so easy. Thank you sir

  • @kalyankrishna5902
    @kalyankrishna5902 4 years ago

    Sir, please upload a KNN algorithm video. I have followed your videos and learned so much from them..... thank you very much, sir

  • @jiatianbu6761
    @jiatianbu6761 3 years ago

    You make Machine Learning so easy to understand. Thanks

  • @mapa5000
    @mapa5000 1 year ago

    Outstanding explanation my friend !! Thank you from Houston

  • @amandal8404
    @amandal8404 3 years ago

    Wow, great intro to cluster analysis in Python. Thank you so much, awesome teaching as always!

  • @kashishgakkar3453
    @kashishgakkar3453 5 years ago

    Superrrrb.... These tutorials just made my life easy.
    It would be great if you could create separate tutorials on the mathematical calculations for each of the models we have learnt in this playlist.
    Keep growing and keep teaching us!!

    • @codebasics
      @codebasics  5 years ago +1

      Sure I am planning to extend this series with more algorithms and ML techniques. Stay tuned.

  • @shantanuraj7086
    @shantanuraj7086 2 years ago

    Amazing video. Creative, resourceful and excellent preparation. Keep posting more such videos.

  • @franky0226
    @franky0226 4 years ago

    thank you so much !
    congrats for 100K !!

  • @dathscom
    @dathscom 1 year ago

    You are awesome man, thanks a lot. Keep on sharing your significant educative videos, please.

  • @anmoldeep0123
    @anmoldeep0123 7 months ago

    What a fantastic way to explain this algorithm! What are the practical use cases of this algo?

  • @ericwr4965
    @ericwr4965 4 years ago +4

    This was brilliant and I appreciate the explanation of the code.
    Question, once you get the clusters identified as you took age and income, what would you explain?
    Would you need the table as well to discuss the pattern as a supplement as otherwise we would just say we have three clusters, but what do they mean?

  • @R3NAN3224
    @R3NAN3224 3 years ago +5

    If you have more than 2 attributes, use.. df[['name', 'example']] > X, [['name', 'example']] > Y (see the sketch at the end of this thread)

    • @paulcurious2324
      @paulcurious2324 3 years ago +1

      i dont get this please could you explain in detail

    • @R3NAN3224
      @R3NAN3224 3 years ago

      @@paulcurious2324 Hi, if you have 3 attributes, for example color, leaf type and flower, you need to make the code using 3 separate arrays.. inside df([['color','green']], [['Leaf','small']], [['flower','white']])

    • @paulcurious2324
      @paulcurious2324 3 years ago +1

      @@R3NAN3224 it says df is not a callable when i do that

    • @paulcurious2324
      @paulcurious2324 3 years ago +1

      @@R3NAN3224 can u send a link to an example, maybe github or repl

    • @R3NAN3224
      @R3NAN3224 3 years ago +1

      @@paulcurious2324 See at 8:00... df is the name associated with your dataframe. Paste your code here so I can see your dataframe name.
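
      A simpler way to handle more than two attributes, sketched here with hypothetical column names: put all numeric feature columns into one DataFrame and fit on that, rather than building separate arrays per attribute (categorical attributes like color would need to be encoded numerically first):

          from sklearn.cluster import KMeans
          from sklearn.preprocessing import MinMaxScaler

          features = df[['age', 'income', 'spending_score']]      # any number of numeric columns
          scaled = MinMaxScaler().fit_transform(features)

          km = KMeans(n_clusters=3, n_init=10, random_state=0)
          df['cluster'] = km.fit_predict(scaled)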

  • @bumohamed624
    @bumohamed624 1 year ago

    This material is brilliant, divided into theory, tutorial and exercise. However, I would suggest focusing on the objective, which is providing a solution for the problem and predicting characteristics using age and income. The prediction accuracy was not shown, and whether we managed to predict correctly or not was left totally gray at the end of the video. I would like your further elaboration on the objective, as after such effort there should be a solution; please respond.

  • @atifmalayalam4007
    @atifmalayalam4007 5 months ago +1

    Very clear. Thanks!

  • @suhaalam2127
    @suhaalam2127 2 years ago

    Thank you so much for such well explained tutorials.