190 - Finding the best model between Random Forest & SVM via hyperparameter tuning

  • Published: 15 Jan 2025

Comments • 7

  • @evyatarcoco · 4 years ago

    As usual, a great video, which makes the fine-tuning work more comfortable and easy.

  • @surflaweb · 4 years ago

    Recently I started following this channel. Great content. It is really interesting to use the conv features to train an SVM model.
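
    A minimal sketch of the idea mentioned in the comment above: use a pretrained CNN as a fixed feature extractor and train an SVM on the extracted conv features. The VGG16 base, input shape, and placeholder arrays are assumptions for illustration, not the video's exact code:

        import numpy as np
        from tensorflow.keras.applications import VGG16
        from sklearn.svm import SVC

        # Pretrained convolutional base without the dense classification head
        conv_base = VGG16(weights='imagenet', include_top=False, input_shape=(128, 128, 3))

        X_train = np.random.rand(20, 128, 128, 3)   # placeholder images
        y_train = np.array([0, 1] * 10)             # placeholder labels (two classes)

        # Extract conv features and flatten them into one vector per image
        features = conv_base.predict(X_train)
        features = features.reshape(features.shape[0], -1)

        # Train an SVM classifier on the CNN features
        svm = SVC(kernel='rbf', C=1.0)
        svm.fit(features, y_train)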

  • @mridulchaturvedi6636 · 2 years ago

    Sir, what are cubic and medium SVM?
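
    The names "cubic SVM" and "medium (Gaussian) SVM" are presets in tools such as MATLAB's Classification Learner rather than scikit-learn terms; roughly, they correspond to a 3rd-degree polynomial kernel and an RBF kernel with a moderate kernel scale. An approximate scikit-learn sketch (the exact preset parameters are an assumption):

        from sklearn.svm import SVC

        cubic_svm = SVC(kernel='poly', degree=3, C=1.0)          # "cubic" ~ 3rd-degree polynomial kernel
        medium_gaussian_svm = SVC(kernel='rbf', gamma='scale')   # "medium" ~ RBF with a moderate kernel scale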

  • @venkatesanr9455 · 4 years ago

    Hi Sreeni sir,
    XGBoost hyperparameter tuning with an example image segmentation (in your style) would be helpful.
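
    A minimal sketch of what the comment asks for, i.e. tuning XGBoost with GridSearchCV. The feature matrix and labels would come from your own (segmentation) pipeline and are shown here as placeholders, and the parameter values are only examples:

        import numpy as np
        from xgboost import XGBClassifier
        from sklearn.model_selection import GridSearchCV

        X = np.random.rand(200, 10)               # placeholder features (e.g. per-pixel filter responses)
        y = np.random.randint(0, 2, 200)          # placeholder labels

        param_grid = {
            'n_estimators': [100, 300],
            'max_depth': [3, 6],
            'learning_rate': [0.05, 0.1],
        }

        grid = GridSearchCV(XGBClassifier(eval_metric='logloss'),
                            param_grid, cv=5, scoring='accuracy')
        grid.fit(X, y)
        print(grid.best_params_, grid.best_score_)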

  • @bijulijin812 · 3 years ago

    Is it possible to train the model batch-wise using grid search?
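
    GridSearchCV fits each candidate on whole training folds, so true batch-wise (incremental) training is not directly supported. One workaround, sketched below with placeholder data, is a manual loop over ParameterGrid using an estimator that supports partial_fit; the validation split and parameter values are assumptions:

        import numpy as np
        from sklearn.linear_model import SGDClassifier
        from sklearn.model_selection import ParameterGrid

        X, y = np.random.rand(1000, 20), np.random.randint(0, 2, 1000)        # placeholder training data
        X_val, y_val = np.random.rand(200, 20), np.random.randint(0, 2, 200)  # placeholder validation data

        param_grid = {'alpha': [1e-4, 1e-3], 'loss': ['hinge', 'modified_huber']}
        batch_size = 100
        best_score, best_params = -np.inf, None

        for params in ParameterGrid(param_grid):
            clf = SGDClassifier(**params)
            for start in range(0, len(X), batch_size):       # feed the data batch by batch
                sl = slice(start, start + batch_size)
                clf.partial_fit(X[sl], y[sl], classes=np.array([0, 1]))
            score = clf.score(X_val, y_val)                  # evaluate each candidate on held-out data
            if score > best_score:
                best_score, best_params = score, params

        print(best_params, best_score)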

  • @sadeghsalehi9103 · 2 years ago

    Hello there, thanks for your great tutorials. I have a question: shouldn't we focus more on the cv argument in GridSearchCV and use something like RepeatedKFold to address both data uncertainty and algorithm uncertainty (in this case, MLP uncertainty)? For example, at least 10-fold repeated 10 times, or even 30 times? I know it takes a really long time, but isn't it the only way we can really trust the result for the best hyperparameters? Or, in your experience, is it okay to just use cv=5 or a 10-fold split and skip the repeating process (i.e., 10-fold, 1 time)?
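
    A minimal sketch of what the question above describes: passing a repeated splitter (10-fold, 10 repeats) as the cv argument of GridSearchCV. The data and parameter values are placeholders; note that runtime scales linearly with n_repeats, which is the practical reason many examples stop at a single 5- or 10-fold split:

        import numpy as np
        from sklearn.model_selection import GridSearchCV, RepeatedStratifiedKFold
        from sklearn.svm import SVC

        X = np.random.rand(300, 10)          # placeholder features
        y = np.random.randint(0, 2, 300)     # placeholder labels

        # 10-fold CV repeated 10 times = 100 fits per parameter combination
        cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=42)
        param_grid = {'C': [0.1, 1, 10], 'kernel': ['rbf', 'poly']}

        grid = GridSearchCV(SVC(), param_grid, cv=cv, scoring='accuracy', n_jobs=-1)
        grid.fit(X, y)
        print(grid.best_params_, grid.best_score_)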

  • @منةالرحمن · 4 years ago

    Thank you, best teacher.