How the Random Forest Algorithm Works | Random Forest Machine Learning

  • Published: 18 Nov 2024

Comments • 114

  • @mosama22
    @mosama22 3 years ago +24

    I'm studying Data Science at MIT, and you really can't imagine, Aman, how much "Unfold Data Science" (and a couple of other channels) is helping me. Before I start any topic, I like to tackle it here first or just get a general idea, and you can't imagine how much your videos have helped! Short, concise, and to the point! Thank you Aman 🙂

  • @Sagar_Tachtode_777
    @Sagar_Tachtode_777 4 years ago +8

    You just nail the big concepts with a simple example.
    Thank you. Keep it UP.
    Grow fast and furious!!

  • @vishalrai2859
    @vishalrai2859 3 years ago +1

    Wow, what a teacher, you are exceptional.
    I think no one on YouTube can teach in such an easy and lucid way.
    Thank you, sir.

  • @_proton5
    @_proton5 3 years ago +3

    Excellent explanation in simple English. Keep up the good work Aman! Thanks!

  • @prakharagrawal4011
    @prakharagrawal4011 3 years ago +1

    The beauty of this lecture is its easy and elegant explanation in simple English. Deadly combination. Thank you Aman

  • @eyobsolomon4663
    @eyobsolomon4663 3 years ago +1

    #Interesting

  • @nooreldali7432
    @nooreldali7432 1 year ago

    Wow, the example about salary in the decision tree was so good! Hats off.

  • @kalam_indian
    @kalam_indian 3 years ago +1

    Please don't dislike;
    he is the best trainer,
    which shows that knowledge is power,
    while most other YouTube videos are replicas of one another with small modifications and no proper concepts.
    Keep it up, you are the best.

  • @uchennanwosu5327
    @uchennanwosu5327 3 years ago +1

    Bravo! Excellent exposition.

  • @askpioneer
    @askpioneer 2 years ago

    I like your simplicity in teaching; you make topics simple. Great job, Aman.

  • @santhoshkumar-dd6xq
    @santhoshkumar-dd6xq 3 years ago +1

    Excellent explanation in simple terms

  • @sangrammishra9923
    @sangrammishra9923 3 years ago +1

    excellent content

  • @sunilsharanappa7721
    @sunilsharanappa7721 4 years ago +1

    Superb explanation, keep going and growing. Thanks a lot.

  • @GopiKumar-ny3xx
    @GopiKumar-ny3xx 4 years ago

    Useful information....nice presentation

  • @Birdsneverfly
    @Birdsneverfly 2 years ago +1

    Excellent explanation, especially the details on what happens when one feature is not selected and how that lets other features vote in. This probably also ties into feature importance.

  • @syedkamran6249
    @syedkamran6249 4 years ago +1

    Well explained sir

  • @spicytuna08
    @spicytuna08 1 year ago

    Wow!!! What a gift for teaching!!!

  • @tejagunupudi5318
    @tejagunupudi5318 2 years ago

    This is pretty good, sir. I got a lot out of this video.

  • @muthierry1
    @muthierry1 3 years ago

    Great explanations. Thank you very much, sir.

  • @imranaziz5856
    @imranaziz5856 2 years ago

    Excellent presentation and content, simplified and in the shortest time! Kudos to you. Thank you.

  • @xendu-d9v
    @xendu-d9v 2 years ago

    my great teacher, thanks

  • @smegala3815
    @smegala3815 2 years ago +1

    Thank you

  • @soheilaahmadi4807
    @soheilaahmadi4807 2 years ago

    Very clear and well explained.

  • @sudhavenugopal3726
    @sudhavenugopal3726 4 years ago +1

    Dear Aman, thank you for your excellent explanation. As I am a slow learner, I have a doubt about 11:25: is that the disadvantage of the decision tree or of the random forest? Your videos are the only source of my learning journey.

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago

      Disadvantage of decision tree - overfitting.
      Disadvantage of random forest - resource-intensive algorithm.

  • @nikhildesai2460
    @nikhildesai2460 2 years ago

    Great Explanations Aman.

  • @azingo2313
    @azingo2313 1 year ago

    Very good explanation ❤

  • @srinivaskrnagar4029
    @srinivaskrnagar4029 3 years ago

    Super lecture, easy to understand, keep up the good work bro...

  • @Gilco333
    @Gilco333 4 years ago

    You have explained the subject very well!!

  • @sandipansarkar9211
    @sandipansarkar9211 3 years ago

    finished watching

  • @adieu_bae
    @adieu_bae 1 year ago

    You are a great teacher!

  • @alkashie9174
    @alkashie9174 4 years ago

    Great video, simple and easy to understand. Thank you, sir!

  • @dimplechutani2768
    @dimplechutani2768 1 year ago

    Hi Aman. It is really great to see all the concepts explained in an easy manner. Thanks for uploading it. I have a quick question: when we test our data on the different decision trees, the test dataset has all N columns while the individual trees were built on n1, n2, n3 columns, so how does that work?

    • @UnfoldDataScience
      @UnfoldDataScience  1 year ago

      Very good question - it's not a parametric model, so it does not matter.

  • @kirtisardana8479
    @kirtisardana8479 4 years ago +1

    Hi Aman... Until now, all your videos have been in order if one follows the playlist from older to newer. It looks like the decision tree video should now be part of this playlist, after the ensemble explanation and before random forest... What do you think 🤔?

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago

      Thanks for the feedback, Kirti. Let me check if I can rearrange. Happy learning. Tc

  • @amoldhumane8277
    @amoldhumane8277 2 years ago

    Thank you Aman!

  • @bangarrajumuppidu8354
    @bangarrajumuppidu8354 3 years ago

    Well explained!!

  • @archanamohapatra7589
    @archanamohapatra7589 3 years ago

    Well explained, Thank you👍

  • @mehdimediouni9659
    @mehdimediouni9659 4 years ago

    very helpful, keep up the good work !

  • @sadhnarai8757
    @sadhnarai8757 4 years ago +1

    Good, Aman.

  • @nikhilgupta4859
    @nikhilgupta4859 3 years ago +1

    Hi Aman,
    You say the output of the random forest is the majority class (suppose Y). Does that mean the prediction for all 300 inputs would now be Y, and that the prediction for all test data would be Y only???

    • @itsarsh4421
      @itsarsh4421 3 years ago

      It's not like that; you're taking it wrong. Not all 300 data points will predict Yes. The data is sampled row- and column-wise (not all 300 data points, but roughly 2/3 of the data). For example, row no. 1 may go into bag 1 and into 2 other bags, each with a different subset of features. If bag 1 gives the output "Yes" and the other 2 bags give "No", then a democratic rule applies: every individual tree has the same weight and the right to vote.
      So in this case the output will be "No".
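
To make the bagging-and-voting idea in the reply above concrete, here is a minimal sketch in Python with scikit-learn. It only illustrates the mechanism, not the video's own code; the synthetic dataset, the three "bags", and the roughly 2/3 sample size are invented for the example.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    # Toy data: 300 rows, 6 features (numbers chosen only for illustration).
    X, y = make_classification(n_samples=300, n_features=6, random_state=42)

    rng = np.random.default_rng(0)
    trees = []
    for _ in range(3):  # three "bags", as in the reply above
        # Bootstrap-style sample of roughly 2/3 of the rows, with replacement.
        rows = rng.choice(len(X), size=200, replace=True)
        trees.append(DecisionTreeClassifier(random_state=0).fit(X[rows], y[rows]))

    # Majority vote: every tree gets one equal vote per test row.
    votes = np.array([t.predict(X[:5]) for t in trees])           # shape (3, 5)
    majority = (votes.sum(axis=0) > len(trees) / 2).astype(int)   # 2-of-3 wins
    print(votes)
    print(majority)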

  • @arko1383
    @arko1383 2 years ago

    Hi Aman, thanks for your videos.
    These are really informative and helpful.
    I have one question that I was asked by an interviewer:
    when should we use random forest instead of XGBoost?

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago +1

      XGBoost needs more server capacity on large data; with random forest you get variable importance. There are many more points to consider as well - this is just a short answer.

  • @vinijcobatgamildotcom
    @vinijcobatgamildotcom 4 years ago +1

    Can you make videos on linear regression and logistic regression?

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago

      Hi Oyster, these videos are already on my channel. Please find the link below:
      ruclips.net/video/8PFt4Jin7B0/видео.html

  • @rusiraliyanage6643
    @rusiraliyanage6643 2 years ago

    Dear sir, are there two methods of constructing random forest algorithms?

  • @jailata3822
    @jailata3822 2 years ago

    Sir, very nice video. Do you also offer a paid course?

  • @mehtashyam5196
    @mehtashyam5196 3 years ago +1

    Sir, how can we decide which category should be taken as the root node of a decision tree when more than 2 categories are given in the data?

  • @gangulreddy7918
    @gangulreddy7918 4 years ago

    Thank u

  • @sagarmestry5514
    @sagarmestry5514 4 years ago

    Hello Aman, can you please explain what the difference is between a random forest classifier and an extra trees classifier?

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago

      Hi Sagar, in the extra trees case, a random value is selected as the split threshold for each feature. I will explain in more detail in a video. Thank you.
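
As a small, hedged illustration of the difference mentioned in the reply above: scikit-learn provides both estimators, and the only change between the two models below is the class used (a random forest searches for the best split threshold among the candidate features, while extra trees draws the thresholds at random). The dataset is synthetic and only for demonstration.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=500, n_features=10, random_state=42)

    # Random forest: per split, search for the best threshold on each candidate feature.
    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    # Extra trees: per split, draw a random threshold on each candidate feature.
    et = ExtraTreesClassifier(n_estimators=100, random_state=0)

    print("Random forest:", cross_val_score(rf, X, y, cv=5).mean())
    print("Extra trees  :", cross_val_score(et, X, y, cv=5).mean())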

  • @nidhichourasiya2743
    @nidhichourasiya2743 3 years ago +1

    I have a question: if we have 1000 records of data and build a random forest with n_estimators=10, how many records will each decision tree be trained on?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago +1

      Good question, Nidhi.
      There are two important parameters: one is "bootstrap" and the other is "max_samples". To take a subset of the data in each tree you must set bootstrap=True; it is True by default in Python's sklearn.
      Coming to "max_samples": if you leave it as None (the default), each tree draws as many bootstrap samples as there are records.
      If you give an integer, that many rows are drawn per tree.
      If you give a decimal, that fraction of the total number of rows is drawn per tree.

    • @nidhichourasiya2743
      @nidhichourasiya2743 3 years ago

      @@UnfoldDataScience tnx sir
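
A minimal sketch of the bootstrap/max_samples behaviour described in the answer above, using scikit-learn's RandomForestClassifier (the synthetic data and the 0.6 fraction are chosen only for illustration):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    # 1000 records and 10 trees, as in the question above.
    X, y = make_classification(n_samples=1000, n_features=8, random_state=42)

    # bootstrap=True (the default): each tree is trained on a bootstrap sample.
    # max_samples controls the size of that sample:
    #   None (default) -> as many draws as there are rows (with repetition)
    #   an int         -> that many rows per tree
    #   a float        -> that fraction of the rows per tree (here 600 rows)
    rf = RandomForestClassifier(n_estimators=10, bootstrap=True,
                                max_samples=0.6, random_state=0)
    rf.fit(X, y)
    print(len(rf.estimators_))  # 10 fitted decision trees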

  • @ravanshyam7653
    @ravanshyam7653 3 years ago

    For example, if there are 500 decision trees and they predict 250 ones and 250 zeros, what will the random forest declare, sir??

  • @askpioneer
    @askpioneer 2 years ago

    I have one doubt: in which scenario should I choose a decision tree over a random forest? It seems random forest is the best, so why should we use the decision tree classifier at all?

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago

      Normally we use random forest or boosting directly, not a single decision tree.

  • @seetharaman5262
    @seetharaman5262 1 year ago

    If I have more than 2 classes, what should I do?

  • @rafibasha4145
    @rafibasha4145 2 years ago

    How do we choose the number of samples?

  • @praveensingh9688
    @praveensingh9688 3 years ago

    Sir, just a small doubt: what are the decision trees in a random forest classifier made of? Do they contain other classifiers such as ANN, logistic regression, SVM, and other types, or something else?

  • @user-px5of1bi2g
    @user-px5of1bi2g 2 years ago

    Is random forest only for predicting? I'm trying to see which features affect the income of taxi drivers in NYC. Can I use random forest for that?

    • @tpennyhealth5861
      @tpennyhealth5861 7 months ago

      ML in general is for predicting. Features could include the type of car, age of the car, hours spent at the wheel, time of day the driver likes to work, the economy of the city, and also the driver's rating from other users.
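
One common way to answer a "which features matter" question like the one above is the feature_importances_ attribute of a fitted forest in scikit-learn. The sketch below is purely illustrative: the column names and values are invented, not real NYC taxi data.

    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor

    # Hypothetical driver data; every column and value here is made up.
    df = pd.DataFrame({
        "car_age":       [3, 7, 1, 10, 5, 2],
        "hours_driven":  [40, 55, 30, 60, 45, 35],
        "driver_rating": [4.8, 4.2, 4.9, 3.9, 4.5, 4.7],
        "income":        [900, 1100, 700, 1200, 950, 800],
    })

    X, y = df.drop(columns="income"), df["income"]
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    # Impurity-based importances: which features the trees relied on most.
    for name, score in zip(X.columns, rf.feature_importances_):
        print(f"{name:>13}: {score:.2f}")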

  • @tpennyhealth5861
    @tpennyhealth5861 7 months ago

    I don't think the sample has to have fewer observations. We sample N times for N rows of data.

  • @krishna7440
    @krishna7440 3 years ago

    Sir, for binary classification:
    if the number of trees is even, say we have 6 trees of which 3 predict Yes (1) and the remaining 3 predict No (0),
    then what should the output of our random forest be - Yes, No, or something else??

    • @usbabu
      @usbabu 3 years ago

      I think it just decides by tossing a coin 🤔.

  • @shekharkumar1902
    @shekharkumar1902 2 years ago

    What is pasting?

  • @ajaykushwaha-je6mw
    @ajaykushwaha-je6mw 3 years ago

    There is no doubt that RF is much better than a decision tree, so why is the decision tree still in use?

  • @HtS643KyS6555GxQ3edA
    @HtS643KyS6555GxQ3edA 3 years ago

    Is there a proof of random forest’s accuracy as an algorithm? Thanks

  • @shivanshjayara6372
    @shivanshjayara6372 4 years ago

    @10:22 You said that salary may not be part of a further decision tree... (may not), but what if salary is the only feature with low entropy and high information gain? If that is so, then I think the root node of every decision tree would be salary.
    Or, since different rows and columns are taken, may it happen that salary is not always selected as the root node?
    I think I asked a question and answered it on my own, but please correct me if I'm wrong.

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago

      Hi Shivansh, not all the columns are selected in every tree. Hence, it's possible that "Salary" is not part of a few trees, and in those trees there is no question of it being the root node.

    • @shivanshsingh5555
      @shivanshsingh5555 4 years ago +1

      OK, you mean to say that in a random forest not all the columns get selected at once for every decision tree... columns get selected randomly?

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago

      Yes absolutely.
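
For reference, in scikit-learn this random selection of columns happens at each split and is controlled by max_features; the effect is the same as described in the thread above: a strong feature such as salary is left out of many splits, so other features also get to vote. A small sketch with synthetic data and parameter values chosen only for illustration:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=400, n_features=9, random_state=42)

    # max_features="sqrt": at each split, only 3 of the 9 columns are considered,
    # so even a very strong feature is absent from many splits and the other
    # features get a chance to contribute.
    rf = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                                random_state=0).fit(X, y)
    print(rf.estimators_[0].max_features_)  # columns considered per split in tree 0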

  • @geekyprogrammer4831
    @geekyprogrammer4831 3 years ago +1

    Are you from Rajasthan?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      No Sir.

    • @geekyprogrammer4831
      @geekyprogrammer4831 3 years ago

      @@UnfoldDataScience I should be the one calling you Sir, lol. Regardless of where you are from, your Data Science content is gold. The way you boil down and explain complex concepts in very simple English is really mind-blowing. In shaa Allah, I am planning to watch all of your videos and extract maximum information from your channel.