Boost XGBoost Performance: Easy Hyperparameter Optimization (Code)
- Published: 28 Sep 2024
- XGBoost (Extreme Gradient Boosting) is a powerful and efficient machine learning algorithm that is based on the principles of gradient boosting. It is widely used for classification and regression tasks due to its high performance and accuracy.
Tuning XGBoost means adjusting its hyperparameters to optimize performance on a specific dataset. Common techniques include grid search and randomized search, with cross-validation used to evaluate the model's performance for each parameter combination. In this video, we look at a more efficient approach to tuning the XGBoost classifier.
Explore the world of data science and machine learning with practical insights geared towards software roles on my channel. I'm Sandeep Singh Sandha, with a bachelor's from IIT Roorkee and a master's and Ph.D. in computer science from UCLA. With hands-on experience at Oracle, IBM, Arm, Amazon, Teradata, and Abacus, I bring practical expertise to the table.
Join me on this channel to dive into crucial topics through working projects, as my goal is to share practical knowledge and make learning enjoyable. For more details about my work, visit: sites.google.c...
Amazing piece of work with Mango
Code: colab.research.google.com/github/ARM-software/mango/blob/main/examples/Xgboost_XGBClassifier.ipynb
Waiting for more videos
👍
Thanks 🙏
I have been searching for this
Xgboost is complex to tune. This looks promising
Thanks for this. xgboost is so hard to tune manually