193 - What is XGBoost and is it really better than Random Forest and Deep Learning?

  • Published: 4 Jul 2024
  • Code generated in the video can be downloaded from here:
    github.com/bnsreenu/python_fo...
    Dataset used in the video: archive.ics.uci.edu/ml/datase...
    XGBoost documentation:
    xgboost.readthedocs.io/en/lat...
    Video by original author: • Kaggle Winning Solutio...
  • Science

Comments • 59

  • @pvrnaaz
    @pvrnaaz 2 years ago +7

    Very organized and clear with excellent examples that make it so easy to understand. Thank you!

  • @caiyu538
    @caiyu538 2 years ago +1

    Perfect tutorial; I am using XGBoost and random forest to analyze some of my work. Always appreciate your continuous efforts to share your knowledge through YouTube.

  • @sudippandit6676
    @sudippandit6676 3 years ago +1

    Very organized and straightforward! Waiting for other videos. Thank you for sharing this knowledge.

  • @venkatesanr9455
    @venkatesanr9455 3 years ago

    Thanks, Sreeni sir, for your valuable and knowledgeable content. Also waiting for the next semantic segmentation series; discussions of hyperparameters and their tuning, and of time series analysis, would be highly helpful.

  • @evazhong4419
    @evazhong4419 2 years ago

    Your explanation is so interesting haha, it helps me a lot to understand the material.

  • @ashift32
    @ashift32 2 years ago +2

    Very well explained, clear and concise. Thanks for taking your time

  • @VarunKumar-pz5si
    @VarunKumar-pz5si 3 years ago +1

    Awesome tutorial, glad I got a great teacher. Thank you...

  • @mhh5002
    @mhh5002 2 years ago +3

    Very well explained, sir. It was intuitive for beginners. The analogies are interesting as well.

  • @andyn6053
    @andyn6053 9 months ago +1

    This was very clear and useful! Do you have any link to your code? Also, could XGBoost be used for linear regression as well?

  • @Ahmetkumas
    @Ahmetkumas 3 years ago +2

    Thanks for the video and effort. Can you make a time series video using XGBoost, or something with multiple features (lags, rolling mean, etc.)?

  • @omeremhan
    @omeremhan 1 year ago

    Magnificent!!! Thanks for the clear explanation, Sir.

  • @riti_joshi
    @riti_joshi 2 months ago

    I never comment on any YouTube videos, but I am compelled to do so here, because I learned most of the analyses for my dissertation by following your tutorials. You're such a great tutor. Thank you so much.

  • @tannyakumar284
    @tannyakumar284 2 years ago

    Hi. I have a 1500x11 dataset and I am trying to see which of cognitive ability, non-cognitive ability, family structure, parental involvement, and school characteristics predict academic performance (measured in terms of grades ranging from 1-5). Should I be using XGBoost for this problem or random forest? Thanks!

  • @rezaniazi4352
    @rezaniazi4352 2 years ago +1

    Thanks for the video.
    What do we have to change if we want to use XGBRegressor() instead of the classifier?
    The xgboost documentation is so confusing!
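
To sketch an answer to the question above: with the scikit-learn-style API, switching from classification to regression is mostly a matter of using XGBRegressor in place of XGBClassifier and scoring with a regression metric (its default objective is squared error). The snippet below is a minimal illustration, not the video's code; the California housing dataset and all hyperparameter values are arbitrary placeholders. For the linear-regression question asked earlier, XGBRegressor(booster='gblinear') boosts linear terms instead of trees, which yields a regularized linear model.

```python
# Minimal sketch (not the video's code): XGBRegressor instead of XGBClassifier.
# The California housing data is just a convenient stand-in regression dataset.
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
from xgboost import XGBRegressor

X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=42)

# Objective defaults to 'reg:squarederror'; the hyperparameters are illustrative.
model = XGBRegressor(n_estimators=300, learning_rate=0.1, max_depth=4)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print("Test RMSE:", mean_squared_error(y_test, preds) ** 0.5)
```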

  • @semon00
    @semon00 3 months ago

    Wow, your explanation is awesome!!!
    Don't stop, please!

  • @evyatarcoco
    @evyatarcoco 3 years ago

    A very useful episode, thanks sir

  • @sbaet
    @sbaet 3 years ago +2

    Can you make a quick video on normalization and standardization for an image dataset?

  • @drforest
    @drforest 1 year ago

    Awesome comparison. Super thanks

  • @grantsmith3653
    @grantsmith3653 1 year ago +2

    Sreeni said we need to normalize, but I always thought we didn't need to do that with trees... Am I confused on something?
    Thanks for the video!!
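
The commenter's recollection matches the usual guidance: tree-based models split on thresholds, so a monotonic rescaling of the features generally leaves the learned trees, and hence the accuracy, essentially unchanged; scaling mainly matters for distance- or gradient-based models such as SVMs and neural networks. A quick sanity check, sketched here with scikit-learn's built-in copy of the same Wisconsin breast cancer (WDBC) data rather than the video's code:

```python
# Sketch: random forest accuracy with and without feature scaling is
# (essentially) identical, because axis-aligned splits only compare
# each feature against a threshold.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=42)

rf_raw = RandomForestClassifier(n_estimators=200, random_state=42)
rf_raw.fit(X_train, y_train)

scaler = StandardScaler().fit(X_train)          # fit on training data only
rf_scaled = RandomForestClassifier(n_estimators=200, random_state=42)
rf_scaled.fit(scaler.transform(X_train), y_train)

print("accuracy without scaling:", rf_raw.score(X_test, y_test))
print("accuracy with scaling:   ", rf_scaled.score(scaler.transform(X_test), y_test))
```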

  • @farhaddavaripour4619
    @farhaddavaripour4619 2 years ago

    Thanks for the video. Something you might have missed: in the figure, the most evolved species is shown with lighter hair than the less evolved ones, which could give the false impression that species with lighter hair are more evolved. It would be great if you could adjust the figure.

  • @vikramsandu6054
    @vikramsandu6054 3 years ago

    Well explained. Thank you so much for the video.

  • @SP-cg9fu
    @SP-cg9fu 1 year ago

    Very useful video! Thank you!

  • @RealThrillMedia
    @RealThrillMedia 1 year ago

    Very helpful thank you!

  • @mouraleog
    @mouraleog 1 year ago

    Awesome video, thank you! Greetings from Brazil!

  • @sathishchetla3986
    @sathishchetla3986 10 months ago

    Thank you so much for your explanation sir

  • @barrelroller8650
    @barrelroller8650 1 year ago +1

    It's not clear where you got the dataset in CSV format - the .zip archive from the provided link includes only the `wdbc.data` and `wdbc.names` files.

  • @kakaliroy4747
    @kakaliroy4747 2 years ago +1

    The example of bagging is so funny and I can fully relate

  • @Bwaaz
    @Bwaaz 1 year ago

    very clear thanks :)

  • @abderrahmaneherbadji5478
    @abderrahmaneherbadji5478 3 years ago

    thank you very much

  • @multiversityx
    @multiversityx 1 year ago

    What’s the method name? When are you presenting at NeurIPS? (I’ll be attending it :)

  • @axe863
    @axe863 7 months ago +1

    XGBoost with Regularized Rotations and Synthetic Feature Construction can approximate Deep NN deepness

  • @longtruong9935
    @longtruong9935 2 years ago

    The dataset in the UCI link is not available now. Could anyone provide an updated link?

  • @darioooc
    @darioooc 1 year ago

    Great!

  • @Lodeken
    @Lodeken 11 months ago

    Wow that analogy! 😂 Amazingly apt lol!

  • @khairulfahim
    @khairulfahim 1 year ago

    Where can I get the exact .csv file?

  • @kangajohn
    @kangajohn 3 years ago

    If your explanations were a Kaggle competition, they would be top 1%.

  • @ghafiqe
    @ghafiqe 1 year ago

    Perfect

  • @ahmedraafat8769
    @ahmedraafat8769 2 years ago +1

    The dataset has been removed from the website. Is it possible to upload it?

    • @DigitalSreeni
      @DigitalSreeni 2 years ago +2

      Just do a Google search for the keywords and you'll find it somewhere, maybe on Kaggle. I do not own the data, so I cannot legally share it.

  • @v1hana350
    @v1hana350 2 years ago

    I have a question about the XGBoost algorithm: how does parallelization work in XGBoost? Could you explain with an example?
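
A sketch of the usual answer, not taken from the video: the boosting rounds themselves are sequential (each new tree fits the errors of the ensemble built so far), but within each round XGBoost parallelizes the split search across features and histogram bins over multiple threads, controlled by the n_jobs (alias nthread) parameter. The toy benchmark below uses synthetic data simply to make the thread count visible in the training time.

```python
# Sketch: boosting rounds run sequentially, but split finding within each
# round is multithreaded; n_jobs sets the number of threads.
import os
import time
from sklearn.datasets import make_classification
from xgboost import XGBClassifier

X, y = make_classification(n_samples=100_000, n_features=50, random_state=0)

for n_jobs in (1, os.cpu_count()):          # single thread vs. all cores
    model = XGBClassifier(n_estimators=100, max_depth=6,
                          tree_method="hist", n_jobs=n_jobs)
    start = time.perf_counter()
    model.fit(X, y)
    print(f"n_jobs={n_jobs}: {time.perf_counter() - start:.1f} s to train")
```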

  • @kangxinwang3886
    @kangxinwang3886 3 years ago +1

    Loved the arranged marriage example! It made it very intuitive and easy to understand. Thank you!

  • @vzinko
    @vzinko 1 year ago +13

    Another case of data leakage. You can't scale X and then split it into test and train. The scaling needs to happen after the split.

    • @Beowulf245
      @Beowulf245 10 months ago +1

      Thank you. At least someone understands.

    • @andyn6053
      @andyn6053 9 months ago

      So this video is incorrect?
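
As the top comment in this thread points out, fitting a scaler on the full dataset before splitting lets the test set's statistics leak into training, so the reported test score can be slightly optimistic (for largely scale-invariant tree models such as XGBoost the numerical effect is usually tiny, but the ordering is still wrong in principle). A leak-free sketch, using scikit-learn's built-in copy of the WDBC data rather than the video's CSV; MinMaxScaler here is only an example, and any scaler follows the same fit-on-train pattern:

```python
# Sketch of the leak-free order: split first, then fit the scaler on the
# training set only and reuse that fitted scaler on the test set.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)

# 1) split the raw data
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

# 2) fit the scaler on the training split only, then transform both splits
scaler = MinMaxScaler().fit(X_train)
X_train_s = scaler.transform(X_train)
X_test_s = scaler.transform(X_test)

model = XGBClassifier(n_estimators=200, max_depth=4)
model.fit(X_train_s, y_train)
print("test accuracy:", model.score(X_test_s, y_test))
```

Wrapping the scaler and model in an sklearn Pipeline makes this ordering automatic, which is especially convenient under cross-validation.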

  • @Frittenfinger
    @Frittenfinger 8 months ago

    Nice T-Shirt 😃

  • @ramakrishnabhupathi4995
    @ramakrishnabhupathi4995 2 years ago

    Good one

  • @andromeda1534
    @andromeda1534 3 years ago

    Looks like when you demo-ed random forest, you didn't comment out the xgb line, so you actually showed the fitting for xgb twice with the same results.

  • @3DComputing
    @3DComputing 2 years ago

    You're worth more money

  • @alejandrovillalobos1678
    @alejandrovillalobos1678 3 years ago +1

    can you talk about transformers please?

  • @user.................
    @user................. 18 days ago

    Bro was trying to share about life and forgot what he's teaching 🤣🤣🤣🤣
    The only place where I got a complete idea of XGBoost, thank you.

  • @agsantiago22
    @agsantiago22 2 years ago

    Thank you!