XGBoost, Random Forest, Logistic Regression Models Training & Feature Importance | PD Model Dev - 3

  • Published: 16 Oct 2024

Comments • 4

  • @Seftehandle
    @Seftehandle 7 months ago

    What other resources besides your videos would you recommend for someone starting from the basics? I watch your videos and they are great, but I may need something more introductory. Thanks.

  • @PrsnaSmiles
    @PrsnaSmiles 9 months ago

    Would you use scaled data to train a tree-based model?

    • @learnerea
      @learnerea 9 months ago

      You are right. However, while scaling is not strictly necessary for tree-based models, it can improve consistency across model types, aid feature-importance interpretation, and potentially influence regularization in the case of XGBoost. Consider the specific requirements of your project and whether these benefits are relevant in your context.
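
      A minimal sketch of this point, assuming scikit-learn and xgboost are installed and using a synthetic stand-in dataset: tree splits are invariant to monotonic rescaling, so the two scores below should be (near-)identical.

      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.preprocessing import StandardScaler
      from xgboost import XGBClassifier

      # Synthetic binary-classification data, a stand-in for the video's PD dataset
      X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

      # Same model trained on raw vs. standardized features
      raw = XGBClassifier(n_estimators=100, random_state=42).fit(X_tr, y_tr)
      scaler = StandardScaler().fit(X_tr)
      scaled = XGBClassifier(n_estimators=100, random_state=42).fit(
          scaler.transform(X_tr), y_tr)

      # Tree splits depend only on feature order, not scale, so the accuracies
      # should match up to floating-point ties
      print(raw.score(X_te, y_te))
      print(scaled.score(scaler.transform(X_te), y_te))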

  • @Seftehandle
    @Seftehandle 7 months ago

    XGBClassifier(base_score=None, booster=None, callbacks=None,
    colsample_bylevel=None, colsample_bynode=None,
    colsample_bytree=None, device=None, early_stopping_rounds=None,
    enable_categorical=False, eval_metric=None, feature_types=None,
    gamma=None, grow_policy=None, importance_type=None,
    interaction_constraints=None, learning_rate=None, max_bin=None,
    max_cat_threshold=None, max_cat_to_onehot=None,
    max_delta_step=None, max_depth=None, max_leaves=None,
    min_child_weight=None, missing=nan, monotone_constraints=None,
    multi_strategy=None, n_estimators=None, n_jobs=None,
    num_parallel_tree=None, random_state=None, ...)

    Any reason why XGB has this problem?
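
    A short sketch, assuming a recent xgboost version in which the scikit-learn wrapper prints unset parameters as None: those None values are placeholders meaning "fall back to the booster's built-in default" (for example max_depth=6, learning_rate=0.3) at fit time, so the output above is expected rather than a bug.

    from xgboost import XGBClassifier

    # Unset parameters print as None; the underlying booster substitutes its
    # own defaults (e.g. max_depth=6) when the model is actually trained
    clf = XGBClassifier()
    print(clf.get_params()["max_depth"])  # None

    # Passing a value explicitly replaces the placeholder in the repr
    clf = XGBClassifier(max_depth=6, learning_rate=0.3)
    print(clf.get_params()["max_depth"])  # 6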