Lab 8: Decision Tree, Random Forest, Gradient Boosted Trees, and Cross-Validation in RapidMiner

  • Published: 14 Oct 2020
  • Objectives:
    • To learn how to classify data with tree models.
    • To learn how to assess model performance with cross-validation.
    Data:
    • Download the house price label Excel file from github.com/xbwei/machine_lear...
    Steps:
    1. Load the house_price_label data into RapidMiner.
    2. Build a set of tree models to classify house types based on house ages and prices. Use Cross-Validation to assess the performance (including accuracy and kappa) of each model:
    2.1. A decision tree model with a maximal depth of 5
    2.2. A decision tree model with a maximal depth of 10
    2.3. A random forest model with 100 trees and a maximal depth of 10
    2.4. A gradient boosted trees model with 20 trees and a maximal depth of 5
    3. Add more features, such as area, number of bathrooms and bedrooms, and lot size, to the tree models to classify the house type, and observe how the models' performance changes.
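The lab is carried out in RapidMiner, but the same model comparison can be sketched in scikit-learn. The snippet below is an illustration only: the synthetic house-age/price data and labels are made-up placeholders for the house_price_label dataset, and hyperparameters mirror steps 2.1-2.4 (two decision trees of depth 5 and 10, a 100-tree random forest of depth 10, and a 20-tree gradient boosted model of depth 5), each scored by 10-fold cross-validation on accuracy and Cohen's kappa.

```python
# Sketch of steps 2.1-2.4 in scikit-learn (a stand-in for RapidMiner).
# The data below is synthetic; real runs would load the house_price_label file.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_validate
from sklearn.metrics import make_scorer, cohen_kappa_score

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([
    rng.integers(1, 80, n),             # placeholder: house age in years
    rng.uniform(50_000, 900_000, n),    # placeholder: house price
])
y = rng.integers(0, 3, n)               # placeholder: three house-type labels

models = {
    "Decision tree (depth 5)":  DecisionTreeClassifier(max_depth=5, random_state=0),
    "Decision tree (depth 10)": DecisionTreeClassifier(max_depth=10, random_state=0),
    "Random forest (100 trees, depth 10)":
        RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0),
    "Gradient boosted trees (20 trees, depth 5)":
        GradientBoostingClassifier(n_estimators=20, max_depth=5, random_state=0),
}

# Score each model with 10-fold cross-validation on accuracy and kappa.
scoring = {"accuracy": "accuracy", "kappa": make_scorer(cohen_kappa_score)}

for name, model in models.items():
    cv = cross_validate(model, X, y, cv=10, scoring=scoring)
    print(f"{name}: accuracy={cv['test_accuracy'].mean():.3f} "
          f"kappa={cv['test_kappa'].mean():.3f}")
```

On random labels like these, all four models should land near chance accuracy with kappa near zero; on the real dataset the comparison across depths and ensemble sizes (and with the extra features from step 3 added to X) is the point of the exercise.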
    More about machine learning: Machine Learning Lab

Comments • 6

  • @mimi7273 · 8 months ago

    Thank you. Awesome videos.
    They helped me a lot

  • @tunanhh002 · 2 years ago

    It's useful! Thanks so much.

  • @sohampavaskar4991 · 2 years ago

    Awesome video bro

  • @bakhtyarali6979 · 2 years ago

    It's great. Can I run a random forest with 19 criteria and one row?

    • @LBSocial · 2 years ago

      I think that is unlikely to work, as machine learning models normally require a large amount of data, i.e., more rows.
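The reply above can be illustrated with a short scikit-learn sketch (the data and labels are made up): a tree model will trivially fit a single row by memorizing it, but cross-validation, and with it any estimate of generalization, is impossible with one sample.

```python
# Illustration of why one row is not enough: fitting succeeds trivially,
# but 10-fold cross-validation cannot split a single sample.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X = np.arange(19, dtype=float).reshape(1, -1)  # one row, 19 criteria (features)
y = np.array([1])                              # a single label

model = DecisionTreeClassifier().fit(X, y)     # fits: the tree memorizes the row
print(model.predict(X))                        # perfectly "predicts" that one row

try:
    cross_val_score(model, X, y, cv=10)        # 10-fold CV needs at least 10 rows
except ValueError as err:
    print("Cross-validation failed:", err)
```

The model scores perfectly on the row it memorized, which says nothing about new data; the failed cross-validation makes the reply's point concrete.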