Scikit-learn Crash Course - Machine Learning Library for Python

  • Published: Oct 1, 2024

Comments • 227

  • @freecodecamp
    @freecodecamp  3 years ago +89

    Message from the creator:
    I hope you've all enjoyed this series of videos. It was fun to collaborate with freeCodeCamp!
    If you're interested in more content from me feel free to check out calmcode. Also, I'd like to give a shoutout to my employer, Rasa! We're using scikit-learn (and a whole bunch of other tools) to build open-source chatbot technology for python. If that sounds interesting, definitely check out rasa.com/docs/rasa/.

  • @MrCrunsh
    @MrCrunsh 3 years ago +85

    I'm busy for the next 2 hours.

  • @buraksenel263
    @buraksenel263 3 years ago +80

    This is by far the most beginner-friendly introduction to scikit-learn I've seen.

  • @riccello
    @riccello 3 years ago +63

    This is the way everything should be taught!
    I love that you present concepts in a structured and systematic way, speaking slowly and clearly, using as few words as possible...
    - starting with the concept and talking through drawing a logical diagram (which is so important for developing abstract thinking in terms of high level concepts, which is how we think when we are experienced in something).
    - then writing clean, concise code to implement each part of the concept
    - showing plots that directly demonstrate the effects of the entire iteration
    Too many tutorials make the mistake of talking too much. A lot of videos also either assume too much or too little about the viewer's knowledge.
    This seems to confidently strike the nail on the head!
    Thanks!

  • @dariuszspiewak5624
    @dariuszspiewak5624 2 years ago +23

    I must agree with others: this is a great lecture. I mean... REALLY good. Vincent, do you have any more of these? This stuff is not only informative, but also pleasant to watch and listen to. Good, correct, and clear English is rather rare these days. Sadly. This lecture is good because it does not shy away from details. It also goes beyond just showing the API. It tries to build something new from the available "Lego" pieces. Which is great as it shows creativity and also how to dig deeper to understand the data. Very, very good exposition. Many thanks.

    • @tyronefrielinghaus3467
      @tyronefrielinghaus3467 10 months ago

      I feel you about clear, well-enunciated English. I HATE having to 'interpret' what I'm hearing... too much extraneous cognitive load for an already high intrinsic-load topic.

  • @gabriel1991
    @gabriel1991 3 years ago +29

    OMG! I love all the content that Vincent makes! I must watch this video!

  • @thecaptain2000
    @thecaptain2000 8 months ago +2

    It is a delicate subject, but I think the question of the algorithm being racist is an ill-advised one. The real question under it is whether the "% of black population" parameter affects the house price or not. Is the aim of a data scientist to make the actual prediction, or to make the data fit a point of view (which, btw, I totally endorse in principle)?

  • @sonalkudva1839
    @sonalkudva1839 8 months ago +4

    I am trying to learn from this course, but it says that the Boston dataset has been removed from scikit-learn. What should I do?

    • @juaningo24
      @juaningo24 4 months ago

      You can still downgrade your scikit-learn version to 1.0.2 and it should be fine; if you don't want to, you can use fetch_california_housing instead.
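      A minimal sketch of that suggested swap (assuming a recent scikit-learn; the variable names are illustrative, not from the video):

      # Use the California housing data in place of the removed Boston dataset.
      from sklearn.datasets import fetch_california_housing

      X, y = fetch_california_housing(return_X_y=True)
      print(X.shape, y.shape)  # (20640, 8) feature matrix and its target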

  • @flashbao1922
    @flashbao1922 3 years ago +17

    This video saved me from a 5K course! Thanks! Loads of Love!

  • @rajveersinghanand
    @rajveersinghanand 3 years ago +17

    16:00 Pipeline
    23:45 grid search
    37:00 StandardScaler
    42:00 quantiles better
    46:55
    55:00 fraud example
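    A compact sketch tying those timestamps together (Pipeline, StandardScaler, GridSearchCV). This is illustrative rather than the video's exact code, and it uses the California housing data as a stand-in for the removed Boston set:

    from sklearn.datasets import fetch_california_housing
    from sklearn.model_selection import GridSearchCV
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = fetch_california_housing(return_X_y=True)

    # Scale first, then model; the scaler is re-fit inside every CV fold.
    pipe = Pipeline([("scale", StandardScaler()),
                     ("model", KNeighborsRegressor())])

    mod = GridSearchCV(pipe,
                       param_grid={"model__n_neighbors": [1, 2, 3, 5, 10]},
                       cv=3)
    mod.fit(X, y)
    print(mod.best_params_)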

  • @codesiddhi
    @codesiddhi 3 years ago +13

    Just Amazing once again, u guys rock as always...

  • @navneetTanks
    @navneetTanks 3 years ago +8

    Thank you very much, much needed for beginners like me ❤️.
    I hope one day, when I become an expert, I will make free courses for others too ❤️

  • @AcidiFy574
    @AcidiFy574 3 years ago +15

    Awesome Tutorial,
    I have some suggestions regarding your content:
    1. Tutorial on RUST
    2. Tutorial on JULIA
    3. Tutorial on AWK & SED
    (Especially AWK)
    4. Tutorial on LUA
    What do you guys think????

  • @louisshengliu
    @louisshengliu 2 years ago +4

    Could you please explain why the min of recall and precision is lower than both? Could not find the appendix.

    • @adrienpyb1611
      @adrienpyb1611 2 years ago

      +1, does anyone know where to find the appendix?

    • @ANONIM9123
      @ANONIM9123 2 years ago +1

      Hint: min_both is calculated separately at every train/test split in the cross-validation.

    • @GaneshGaiy
      @GaneshGaiy 7 months ago +1

      +1, same, could not find the appendix

  • @lVaNeSsA90
    @lVaNeSsA90 3 years ago +13

    Wow - I need to share this with the rest of the class! Thanks for making this video so understandable.

  • @cerioscha
    @cerioscha 10 months ago +4

    Great video series, thanks! In this video at 56:56 I think you meant to say that "there are way more cases without fraud than with fraud".

    • @victoran0
      @victoran0 9 months ago +3

      Exactly why I came to the comments.

  • @ThomasKuncewicz
    @ThomasKuncewicz 1 year ago +7

    The way each dataset complements the associated pitfall you want to bring up at a given moment... wow. What an amazing intro -- it must have taken a lot of forethought and behind the scenes organization to make the flow of this video series seem so effortless. THANK YOU!!

    • @wws9999
      @wws9999 11 months ago

      Please, bro, can you tell me where to find the appendix for the plot answer?

  • @Natalie-rl5wz
    @Natalie-rl5wz 6 months ago +4

    Hello, I just wanted to say, for those who plan to follow the videos: the 'Boston house prices' dataset has been removed from scikit-learn, so this tutorial no longer works as-is unless you change the dataset.
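    One hedged workaround, assuming the original table is still hosted on OpenML under the name "boston" (otherwise fetch_california_housing is the usual stand-in):

    from sklearn.datasets import fetch_openml

    # Pull the classic Boston housing table from OpenML instead of load_boston.
    boston = fetch_openml(name="boston", version=1, as_frame=True)
    X, y = boston.data, boston.target
    print(X.shape)  # expected: (506, 13)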

  • @hassanhijazi4757
    @hassanhijazi4757 1 year ago +1

    I did not succeed in reproducing the figure at 1:16:56. I keep getting the same figure as the one just before, even though I did the log transformation of the "Amount" column. Has anyone had the same problem?
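    For anyone comparing notes, a small sketch of the kind of transform being described. The file path is an assumption, the column name "Amount" follows the Kaggle credit-card fraud data, and np.log1p is used here to avoid log(0):

    import numpy as np
    import pandas as pd

    df = pd.read_csv("creditcard.csv")         # assumed local copy of the Kaggle data
    df["Amount_log"] = np.log1p(df["Amount"])  # log(1 + x) handles zero amounts

    # Plot the new column, not the original one, or the figure will not change.
    df["Amount_log"].hist(bins=50)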

  • @wws9999
    @wws9999 11 months ago +1

    Please, guys, where is this appendix for the plot answer????????????????

    • @vignatej663
      @vignatej663 7 months ago

      Bro, did you get an answer?

  • @ultraviolenc3
    @ultraviolenc3 3 years ago +2

    1:11:00 what’s the answer though?

  • @thomasnissen6695
    @thomasnissen6695 1 year ago +2

    Did anybody figure out why the mean of the min(recall, precision) was below the actual mean of both recall & precision? 1:10:57

    • @meisterpianist
      @meisterpianist 8 months ago +1

      The mean is always measured over all 10 splits, for precision, for recall AND for the minimum separately. In other words, FIRST the minimum is calculated, THEN the mean over all these minimums is calculated. If you would have only one split, there would not be a problem. But starting with two splits, we have: test_precision 1.0 and 0.46 = mean 0.73. test_recall 0.37 and 1.0 = mean 0.68. However, the minimum is 0.37 and 0.46, and if you calculate the mean of these two, it's 0.42, which is below 0.73 and below 0.68. So it's reasonable that the minimum is always a bit lower than each of the two lines. In fact, I never found the "appendix", Vincent was talking about. I just took the grid-results as a dataframe, exported it to excel and played a bit around.
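      A tiny numeric sketch of that point, reusing the two example splits above:

      import numpy as np

      precision = np.array([1.00, 0.46])   # one value per CV split
      recall    = np.array([0.37, 1.00])

      print(precision.mean())                      # 0.73
      print(recall.mean())                         # 0.685
      print(np.minimum(precision, recall).mean())  # 0.415, below both means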

    • @GaneshGaiy
      @GaneshGaiy 7 months ago

      @@meisterpianist Thanks for the explanation!

  • @fishnchips6627
    @fishnchips6627 2 years ago +1

    35:56 As a non-American, it is so satisfying hearing z read as 'zed' not 'zee'. lol

  • @tanb13
    @tanb13 3 years ago +5

    Does Vincent have his own channel? I just love his teaching style!!

  • @imdadood5705
    @imdadood5705 3 years ago +4

    Just completed the first part of the lecture. I have been using scikit for a couple of months! Dudeee! This is an eye opener!

  • @kevindandrade5307
    @kevindandrade5307 3 years ago +3

    The section on Metrics gets confusing for me. Any easy to understand books I can read for understanding metrics?

    • @saptarshisanyal4869
      @saptarshisanyal4869 2 years ago

      The metrics section was overwhelming for me as well. There has to be some prerequisite groundwork before going for this.

  • @AlmogYosef520
    @AlmogYosef520 3 years ago +3

    Hi, what do you guys suggest I watch if I'm totally new to ML?
    I find this course a little bit beyond my knowledge. I thought that because I've got the foundations of DS I could jump into this course, but I think I'll need some intro-to-ML videos first.

  • @jakobaljaz705
    @jakobaljaz705 1 year ago +2

    I feel I learned so much, great job sir. Thank you :)

  • @abhijeetkushwaha424
    @abhijeetkushwaha424 3 years ago +5

    Do you guys, like... read minds or something?
    I was working on a Django project yesterday, and you released one. I was stuck on ML today, and here's the video. Wicked!

  • @JoshJetson
    @JoshJetson 1 year ago +3

    This is an excellent tutorial. I'm doing the Coursera IBM machine learning cert and supplementing it with this video. Overall this is a much more palatable and easier-to-understand tutorial of scikit-learn, and really of a machine learning model in general. Awesome work!

  • @nguyenphutho9503
    @nguyenphutho9503 3 years ago +1

    Sorry, I have a question:
    which versions of Python and OpenCV are compatible?
    I have followed a lot of tutorials but have been unable to find matching, compatible versions of Python and OpenCV.
    Please help me find a solution for my own project. Thank you so much.

  • @rajatsharma6137
    @rajatsharma6137 3 years ago +1

    Sorry... but I totally lost it from metrics onwards... it was too heavy to understand... I did not understand even the purpose of the lecture, let alone the code...

  • @rodrigo100kk
    @rodrigo100kk 3 years ago +3

    Great video! At 1:49:40 you could use ".values" at the end instead of np.array at the beginning.
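    For readers following along, both forms give a plain NumPy array, so which one you prefer is mostly style (df here is a small made-up DataFrame, not the video's data):

    import numpy as np
    import pandas as pd

    df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})

    arr1 = np.array(df[["a", "b"]])  # wrap in np.array at the beginning
    arr2 = df[["a", "b"]].values     # or ask pandas for the array at the end
    assert (arr1 == arr2).all()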

  • @riccello
    @riccello 3 years ago +1

    Can I ask you how you are able to draw on the screen? I understand you are probably using a Stylus pen over some touch screen surface, which mirrors your display, but what software are you using for that?

  • @develxper7931
    @develxper7931 2 years ago +2

    I was rewatching the course to make my basics better. There were actually a lot of details, man!!!

  • @ЭльмарИдрисов-г5э
    @ЭльмарИдрисов-г5э 3 years ago +1

    Could you please do "Python for Raspberry Pi 4"? I cannot find a proper guide which introduces and explains things from the very beginning. I would like to experiment with robotics (e.g. a robot arm, etc.), but have no idea how to start programming it. All the available guides use irrelevant projects to start with the Raspberry.
    Note: Thank you for the tutorial!

    • @mwanikimwaniki6801
      @mwanikimwaniki6801 3 years ago

      I could help with a little info if you are still interested.

  • @rodiekozlovsky2415
    @rodiekozlovsky2415 3 years ago +2

    What a great course! Thank you for opening the gates.

  • @messedinsaan
    @messedinsaan 2 months ago

    "Building dependencies failed"
    error: subprocess-exited-with-error
    Cannot import boston housing price dataset.

  • @olhaklishchuk
    @olhaklishchuk 1 year ago

    I have one question about the elapsed time of the GridSearchCV pipeline: how do I minimize the running time? My model was estimated with a mean fit time of at least 9 minutes. My processor is an AMD Ryzen 5 5500U with Radeon Graphics, 2.10 GHz and 6 cores. Thank you in advance!
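    A couple of hedged knobs that usually help with that: run the folds in parallel with n_jobs, and keep the grid and cv small while iterating. Illustrative sketch (the estimator and grid below are placeholders, not your pipeline):

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV

    X, y = load_breast_cancer(return_X_y=True)

    mod = GridSearchCV(LogisticRegression(max_iter=5000),
                       param_grid={"C": [0.1, 1.0, 10.0]},  # small grid first
                       cv=3,        # fewer folds while experimenting
                       n_jobs=-1,   # run folds on all available cores
                       verbose=1)
    mod.fit(X, y)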

  • @locky916
    @locky916 7 months ago

    Thanks for this great material about scikit-learn; it is really helpful, and understanding is more comfortable with the educator's beautiful explanations. Huge thanks and keep going...

  • @dilshanchrishantha6548
    @dilshanchrishantha6548 3 years ago +1

    Excellent explanation for a beginner in ML. Thanks for the course.

  • @crashingatom6755
    @crashingatom6755 8 months ago

    How did the entirety of setting up and getting Jupyter Notebooks to function...just get skipped? Everything beyond that is useless because JN is the worst software in history.

  • @gustavojuantorena
    @gustavojuantorena 3 years ago +1

    Awesome! Thank you for sharing!

  • @muhammadsahalsaiyed2595
    @muhammadsahalsaiyed2595 1 month ago

    The Boston House Price dataset is available on Kaggle, for those who are saying scikit-learn has removed it.

  • @pw7225
    @pw7225 2 years ago +1

    Kudos! Excellent training.

  • @juanete69
    @juanete69 2 years ago

    Is GridSearchCV(..., cv=3) doing nested cross-validation?
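    For what it's worth: on its own, GridSearchCV(..., cv=3) is a single (inner) cross-validation used to pick hyperparameters. It only becomes nested cross-validation if you wrap the search in an outer loop, e.g. (illustrative estimator and grid):

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV, cross_val_score

    X, y = load_breast_cancer(return_X_y=True)

    # Inner CV: three folds, used only to choose C.
    inner = GridSearchCV(LogisticRegression(max_iter=5000),
                         param_grid={"C": [0.1, 1.0, 10.0]},
                         cv=3)

    # Outer CV around the search makes it nested: each outer fold tunes its own
    # model and is scored on data the tuning never touched.
    scores = cross_val_score(inner, X, y, cv=5)
    print(scores.mean())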

  • @iyar220
    @iyar220 1 year ago

    What if I want to use PyCharm instead of Jupyter Notebook? Would I still be able to follow this course, or am I better off looking for another one?
    (There's this other course on this channel, but this one has better audio quality and overall seems more pleasant to follow, so I'm not sure ruclips.net/video/pqNCD_5r0IU/видео.html)
    Note: I understand that Python is used in both places, but I don't know how much of an effect using a different platform would have on the learning experience.

  • @2traquinas
    @2traquinas 1 year ago

    Does someone have a credit card fraud .csv similar to the teacher's? With the sheet I got on Kaggle, I can't convert it directly to a dataframe (yes, I tried some pre-treatment on the file, but in the last row, if I sum everything up, it returns 0).

  • @LucasBiunessa
    @LucasBiunessa 4 months ago

    Impossible to continue with this course because of the ethical problem with this dataset.

  • @vadimrudakov8907
    @vadimrudakov8907 1 year ago

    Data leakage? In the introductory section (like at 28:41) we have a grid search that contains a pipeline with the numeric-features transformer. I guess this is the road to data leakage, because in our pipeline we first transform all the numeric features in the entire dataset and straight after that we start our model learning through the cross-validation process within the entirely transformed dataset. Our training sets, created during CV, contain previously standardized data, so the model "knows" something about the examples that are not in the training set and can predict better when processing them in the prediction step. Thus we should exclude any numeric-feature transformation from our grid search, am I right? If I'm not, please explain the mechanism.
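    For reference (offered as clarification, not a quote from the video): when the scaler is a step inside the pipeline passed to GridSearchCV, scikit-learn clones the whole pipeline for each fold and fits the scaler only on that fold's training portion, so the validation rows are never seen during standardization. Leakage happens when you scale the full dataset before cross-validation. A small sketch of both patterns:

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)

    # Leaky pattern: the scaler has already seen the validation rows.
    X_scaled = StandardScaler().fit_transform(X)
    leaky = cross_val_score(LogisticRegression(max_iter=5000), X_scaled, y, cv=3)

    # Safe pattern: the scaler is re-fit inside each training fold.
    pipe = Pipeline([("scale", StandardScaler()),
                     ("model", LogisticRegression(max_iter=5000))])
    safe = cross_val_score(pipe, X, y, cv=3)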

  • @markomilenkovic2714
    @markomilenkovic2714 1 year ago

    Is it still worth watching this video? How much has changed in two years? Thank you.

  • @Eizengoldt
    @Eizengoldt 11 months ago +1

    I hate Python so much, just error after error.

    • @NotEnoughTime-cf2pi
      @NotEnoughTime-cf2pi 6 months ago

      I actually agree with you. I am having a hard time switching from R using caret. Good luck.

  • @wiktorm9858
    @wiktorm9858 11 months ago

    Time series needed these polynomial parameters, I think. Cool tutorial though!

  • @anandsrikumar007
    @anandsrikumar007 3 years ago

    If I get a high-paying job, I will donate at least 5000 rupees to freeCodeCamp.

  • @JoseRicardoXavier
    @JoseRicardoXavier 3 years ago +1

    Amazing presentation!!

  • @5tr0mx
    @5tr0mx 3 years ago

    25:50 Using spaces instead of tabs... stops watching :) (joke) Great video.

  • @Treegrower
    @Treegrower 5 months ago

    This video is awesome! Your narration style is fantastic.

  • @linkified220
    @linkified220 1 year ago

    Is it just me, or does everyone say that every language and library is extremely popular and is the main thing when it comes to building the best things in the world?

  • @espirikii
    @espirikii 2 years ago

    For the Titanic example: 76% of the women survived, whereas just 16% of the men survived. That would have been a really good classifier to start with.

  • @Zoro-VXY
    @Zoro-VXY 25 days ago

    vincent chansard

  • @VASUofficial0
    @VASUofficial0 3 months ago

    For better learning, you could also provide links to the data used in this course, sir, if you can.

  • @Rukoilla
    @Rukoilla 3 months ago

    B for blacks is wild.

  • @khal7994
    @khal7994 2 years ago

    00:19 I did not understand why, after changing the k value from 5 to 1, the prediction diagram changed. KNN is a classification algorithm, but here it behaved like a regression.
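    For context (an aside, not the video's code): scikit-learn ships KNN both as a classifier and as a regressor, and a housing-price example would use the regressor, which is why the output looks like a fitted curve. A tiny sketch of how n_neighbors changes that fit:

    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(0)
    X = np.sort(rng.uniform(0, 10, size=(100, 1)), axis=0)
    y = np.sin(X).ravel() + rng.normal(scale=0.2, size=100)

    for k in (1, 5):
        pred = KNeighborsRegressor(n_neighbors=k).fit(X, y).predict(X)
        print(k, round(float(np.abs(pred - y).mean()), 3))  # k=1 memorises the noise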

  • @tilakrajchoubey5534
    @tilakrajchoubey5534 2 years ago

    Does this video contain anything about ML algorithms?

  • @Thermonator621
    @Thermonator621 1 year ago

    I am not getting this chart at this point in time ruclips.net/video/0B5eIE_1vpU/видео.html
    I get something more like the original, but the dotted line is between 10 and 12.5 class weight.

  • @cientifiko
    @cientifiko 1 year ago

    Very useful... I ran the code in IDLE but it didn't work well; there are some things that need revising, like a library being imported after the variable that uses it.

  • @shajidmughal3386
    @shajidmughal3386 1 year ago

    So far into the video, I don't see the data split into train and test samples. Does that mean the model is tested on seen data? If yes, how reliable are these metrics?
    Someone shed some light, please.
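    For anyone with the same doubt: where the video fits and scores on the same data (as in the opening demo), the metrics are indeed on seen data; the GridSearchCV sections, however, score on held-out folds via cross-validation. A conventional explicit hold-out split looks like this (illustrative, using the California data as a stand-in):

    from sklearn.datasets import fetch_california_housing
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsRegressor

    X, y = fetch_california_housing(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    model = KNeighborsRegressor().fit(X_train, y_train)
    print(model.score(X_test, y_test))  # R^2 on data the model never saw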

  • @asokt4931
    @asokt4931 1 year ago

    What do you mean, watch all these videos? Are there different video series?

  • @8.O.8.
    @8.O.8. 1 year ago

    I was wondering why I got the huge red warning when running load_boston; it's ridiculous how real that 30:40 moment is.

  • @xnalebb
    @xnalebb 6 months ago

    At the metrics part, when you plot mean recall and mean precision, how is it that I got the same results for the train and test sets?

  • @albertog2196
    @albertog2196 3 years ago +1

    Very good teacher. Thanks for the content, I learned a lot.

  • @mugumyavicent2803
    @mugumyavicent2803 2 years ago

    Thanks from my namesake --- Vicent. You inspire me to do machine learning.

  • @ninadkawade4681
    @ninadkawade4681 2 months ago

    What are the prerequisites for scikit-learn??

  • @royalmaddy0135
    @royalmaddy0135 3 years ago

    6:07

  • @abdougadrydiallo1318
    @abdougadrydiallo1318 3 years ago +1

    Where can we find the dataset?

  • @ToanTran-nt7jg
    @ToanTran-nt7jg 2 years ago

    The GitHub code link is 404, please fix it.

  • @parzynamea4701
    @parzynamea4701 3 years ago

    Where is that make_plots function from, at 1:31:00?

  • @abdelkaderkaouane1944
    @abdelkaderkaouane1944 1 year ago

    Very interesting, thank you very much.

  • @ShiftKoncepts
    @ShiftKoncepts 11 months ago

    thank you so much! I am slowly digesting this stuff and most likely will have to review it 2 or more times.

  • @SK-qj3oj
    @SK-qj3oj 5 months ago

    Wow, such an awesome course, can't believe this is free.

  • @memelol1859
    @memelol1859 2 years ago

    Wow, thank you, this really clarified my doubts :)

  • @JoshKonoff1
    @JoshKonoff1 2 years ago

    Where are the datasets for the sklearn metric tutorial (credit card dataset, etc)? Thank you!

  • @mwaikul
    @mwaikul 3 years ago

    Is there a way for KNN to skip the closest nearest neighbor?
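    One hedged way to do that with the lower-level API: ask for one extra neighbor and drop the first column, which is the query point itself (at distance 0) when you query the training data:

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    X = np.random.RandomState(0).rand(20, 2)  # toy data

    nn = NearestNeighbors(n_neighbors=3).fit(X)
    dist, idx = nn.kneighbors(X)   # column 0 is each point itself

    neighbors_without_self = idx[:, 1:]
    print(neighbors_without_self[:3])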

  • @berdeter
    @berdeter 1 year ago

    I loved the end chapter that joined machine learning with the expert systems I used 30 years ago...

  • @dilshanchrishantha6548
    @dilshanchrishantha6548 3 years ago +1

    Great series of demo videos. Well explained for a beginner learning from zero.

  • @_seeker423
    @_seeker423 8 months ago

    At 43:00, where you perform the QuantileTransformer step and plot it... shouldn't the scatter plot function take X (non-transformed) and X_new (transformed) data as params? A little confused why we passed X_new[:, 0] and X_new[:, 1]. It seems like we plotted two different features (indexed by 0 and 1) after the transformation step?

    • @vignatej663
      @vignatej663 7 months ago

      No, that is NumPy indexing syntax (the transformer returns a NumPy array):
      X_new[rows, cols] selects the given rows and columns.
      So X_new[:, 0] takes all rows of column 0, and X_new[:, 1] takes all rows of column 1, i.e. the two features after transformation.
      Hope this helps
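      For completeness, a small sketch of the pattern under discussion (synthetic data, not the video's): the transform returns an array, and the two plotted columns are the two transformed features.

      import numpy as np
      import matplotlib.pyplot as plt
      from sklearn.preprocessing import QuantileTransformer

      X = np.random.RandomState(0).exponential(size=(1000, 2))
      X_new = QuantileTransformer(n_quantiles=100).fit_transform(X)

      # Feature 0 on the x-axis against feature 1 on the y-axis, post-transform.
      plt.scatter(X_new[:, 0], X_new[:, 1], s=5)
      plt.show()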

  • @yugosaito9704
    @yugosaito9704 1 year ago

    Thank you for uploading this video!

  • @rouzbehamirazodi3001
    @rouzbehamirazodi3001 9 months ago

    Well explained, and high-quality video and audio, unlike some other videos out there.

  • @srichaidiamond1032
    @srichaidiamond1032 2 years ago

    Hello,
    I run into an AttributeError exception when I try to access .cv_results_ on my model:
    'GridSearchCV' object has no attribute 'cv_results_'
    df = pd.DataFrame(mod1.cv_results_)  # is the line of code, where mod1 is my model.
    Does anyone know if there is a bug? I am using version 1.1.1 of scikit-learn.

    • @bonettimauricio
      @bonettimauricio 2 years ago

      I'm having the very same error here as well; I have installed the specific version scikit-learn==0.23.0.
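      One common cause of that exact AttributeError (offered as a guess, not a confirmed bug): cv_results_ only exists after the search has been fitted, so the DataFrame has to be built after .fit(). Illustrative sketch:

      import pandas as pd
      from sklearn.datasets import load_breast_cancer
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import GridSearchCV

      X, y = load_breast_cancer(return_X_y=True)
      mod1 = GridSearchCV(LogisticRegression(max_iter=5000),
                          param_grid={"C": [0.1, 1.0, 10.0]}, cv=3)

      # pd.DataFrame(mod1.cv_results_) here would raise the AttributeError above.
      mod1.fit(X, y)
      df = pd.DataFrame(mod1.cv_results_)  # works once the search is fitted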

  • @feep1642
    @feep1642 3 years ago +1

    Very nice tutorial, watched the whole thing.

    • @arnavmehta3669
      @arnavmehta3669 3 years ago

      How did you watch a 2-hour video in 27 minutes?

  • @vigneshpadmanabhan
    @vigneshpadmanabhan 3 years ago +1

    Thanks!

    • @vigneshpadmanabhan
      @vigneshpadmanabhan 3 years ago

      This is one of the best videos I have seen covering sklearn so well. Thanks a lot! Would love to learn scikit-learn in more depth for different scenarios...

    • @saptarshisanyal4869
      @saptarshisanyal4869 2 years ago

      Hi Vignesh, could you suggest a book which covers the metrics section?

  • @johnmo1111
    @johnmo1111 1 year ago

    Great video. Helped me with multiple sections that I had been fumbling my way through. Not hard going over some things I already knew as well.
    Thanks for this.. 👍

  • @ginopeduto4264
    @ginopeduto4264 1 month ago

    So well explained, thank you.

  • @kodiaktheband
    @kodiaktheband 1 year ago +2

    The Boston housing prices dataset has an ethical problem: as investigated in [1], the authors of this dataset engineered a non-invertible variable "B" assuming that racial self-segregation had a positive impact on house prices [2]. Furthermore, the goal of the research that led to the creation of this dataset was to study the impact of air quality, but it did not give an adequate demonstration of the validity of this assumption.
    The scikit-learn maintainers therefore strongly discourage the use of this dataset unless the purpose of the code is to study and educate about ethical issues in data science and machine learning.

  • @Duh_Daily
    @Duh_Daily 1 year ago

    The explanations are well detailed; this really helps with understanding the library and knowing exactly what to use and where to use it. You have helped a great community of beginners. 🙏🏾🙏🏾🙏🏾🙏🏾🙏🏾