Dropout Regularization | Deep Learning Tutorial 20 (Tensorflow2.0, Keras & Python)

  • Published: 23 Nov 2024

Comments • 76

  • @codebasics
    @codebasics  2 years ago +2

    Check out our premium machine learning course with 2 industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced

  • @codinghighlightswithsadra7343
    @codinghighlightswithsadra7343 1 year ago +7

    I'm at a loss for words to express my gratitude towards you. Your tutorial is amazing, thank you so much.

  • @shaiksuleman3191
    @shaiksuleman3191 4 years ago +3

    The sky has no limits, and your teaching leaves no questions. Those who dislike this are like people searching for the color of water, or running a car without petrol.

  • @clementvirgeniya8000
    @clementvirgeniya8000 4 years ago +18

    Sir, your way of teaching is awesome. Please do videos on multi-class classification problems in deep learning.

    • @rafibasha4145
      @rafibasha4145 2 years ago

      Hi, can you please let me know how dropout works during the testing phase?

    • @nitishkeshri2378
      @nitishkeshri2378 2 years ago

      @@rafibasha4145 Basically, it randomly drops neurons from the hidden layers during training.

  • @geekyprogrammer4831
    @geekyprogrammer4831 2 years ago +3

    Such a funny analogy at the beginning! You are a true genius educator :D :D

  • @karthikc8992
    @karthikc8992 4 years ago +5

    Happy to be here again!!!

  • @pa5119
    @pa5119 4 years ago +3

    Sir, your explanation is great, great, great.
    But sir, please make the videos in this series faster, so that as our exams come near we can prepare well and finish in less time.
    Thanks a lot for making such good videos.

    • @codebasics
      @codebasics  4 years ago +1

      I will try my best, Pa. And thanks for your kind words.

    • @bratsummer1980
      @bratsummer1980 4 years ago

      @@codebasics Sir, my project is actually on continuous data with 5 inputs and one output.
      I have used multiple linear regression.
      I am also using an ANN with batch gradient descent and TensorFlow.
      But I request you to please upload a video on continuous output in supervised learning, like predicting salary with TF.

  • @manideep1882
    @manideep1882 3 years ago +1

    And here I was, waiting for dropout regularization to happen and for you to delete dense layers #2 and #3!! Hahaha. Great stuff. Keep up the good work.

  • @jongcheulkim7284
    @jongcheulkim7284 3 years ago +4

    I am enjoying your tutorials. Thank you so much.

  • @hardikvegad3508
    @hardikvegad3508 4 years ago +2

    Sir, please cover the concept of EARLY STOPPING... I know the implementation part but want to understand it in depth.
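
For reference, the early stopping the commenter asks about is available in Keras as a built-in callback. A minimal sketch, assuming a compiled model trained with a validation split; the monitored metric and the patience value are illustrative:

    import tensorflow as tf

    # Stop once validation loss has not improved for 5 consecutive epochs,
    # and roll back to the weights from the best epoch seen so far.
    early_stop = tf.keras.callbacks.EarlyStopping(
        monitor='val_loss',
        patience=5,
        restore_best_weights=True,
    )

    # model.fit(X_train, y_train, validation_split=0.2,
    #           epochs=100, callbacks=[early_stop])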

  • @very_nice_777
    @very_nice_777 1 year ago +2

    Sir, can you explain why dropping 50% of the neurons isn't the same as reducing the layer size by 50%? For example, instead of taking 60 neurons and dropping 50%, why don't we just take 30 neurons to begin with?
    Thanks in advance.

    • @NitinKumar-wm2dg
      @NitinKumar-wm2dg 1 year ago

      Because the network might become overly reliant on particular neurons. With each epoch it randomly drops a different set of neurons, trains the model, and then backpropagates. This way we avoid that bias in the neural network and train more robustly.

    • @r0cketRacoon
      @r0cketRacoon 8 months ago

      @@NitinKumar-wm2dg So the whole idea of dropout is that for each epoch it RANDOMLY drops neurons (different from the neurons dropped in the previous and following epochs) and trains with the remaining ones?
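
A minimal sketch of what this thread describes, in Keras. The layer sizes and the 0.5 rate are illustrative (they mirror the 60-feature sonar example from the video); note that in Keras the dropout mask is actually resampled for every training batch, not just once per epoch:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(60, activation='relu', input_shape=(60,)),
        tf.keras.layers.Dropout(0.5),  # each forward pass zeroes a random 50% of these units
        tf.keras.layers.Dense(30, activation='relu'),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])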

  • @sandiproy330
    @sandiproy330 1 year ago

    Thank you. Nicely explained with a clear-cut example.

  • @iaconst4.0
    @iaconst4.0 8 months ago +1

    My accuracy came out almost the same; nevertheless, I appreciate the video.

  • @balajiplatinum08
    @balajiplatinum08 3 years ago +2

    Hi,
    In deep learning, can you please post some videos on hyperparameter tuning?
    Thanks

  • @devilzwishbone
    @devilzwishbone 1 year ago

    05:10 So effectively dropout could be considered similar to a test/train split, in that it trains neurons A and C, then adjusts B and D based on the results from A and C?

  • @achelias8477
    @achelias8477 2 years ago +1

    Thank you for this amazing tutorial! I even understood batch size, without that being my goal with this video!

  • @vishaltanawade7637
    @vishaltanawade7637 3 years ago +3

    9:10 Can we replace M and R with 0 and 1 instead of using dummy variables?

  • @clementvirgeniya8000
    @clementvirgeniya8000 4 years ago +2

    Sir, in my dataset I have 20 target classes (i.e., a multi-class problem). When I train and test, my accuracy is only 45%. I am a little bit stuck with this. It would be helpful if you could give me some suggestions.

  • @MrSHANKSHINE
    @MrSHANKSHINE 3 years ago +1

    Awesome, sir. Thank you so much for making us understand such important concepts in a simple and easy way!!!

  • @souvikghosh6509
    @souvikghosh6509 3 years ago +1

    Sir, is it possible to apply dropout in a deep autoencoder?

  • @abdansyakura2982
    @abdansyakura2982 2 years ago +1

    Hi sir, I have a question about the dropout technique. As we can see, it randomly deactivates neurons; what about during testing, are they still deactivated?

    • @fakharyarkhan5848
      @fakharyarkhan5848 2 years ago

      They're only deactivated during training. During testing, all of the weights are rescaled by the keep probability (1 - p, where p is the probability that a node gets dropped).
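
A quick way to check this behavior in Keras (a sketch): modern implementations use "inverted dropout", scaling the surviving activations up by 1/(1 - rate) during training, so at inference time the layer is simply an identity and no weight rescaling is needed:

    import tensorflow as tf

    drop = tf.keras.layers.Dropout(0.5)
    x = tf.ones((1, 6))

    # Training mode: a random half of the units are zeroed,
    # survivors are scaled by 1 / (1 - 0.5) = 2.
    print(drop(x, training=True).numpy())

    # Inference mode (what model.predict / model.evaluate use): dropout is a no-op.
    print(drop(x, training=False).numpy())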

  • @jansirani4429
    @jansirani4429 7 months ago

    Very good explanation

  • @mdsifath7741
    @mdsifath7741 4 years ago +1

    Sir, can you make or suggest a video on the Adam optimizer? @codebasics

    • @codebasics
      @codebasics  4 years ago +3

      Yes, I will be adding that in this series.

  • @ncf2294
    @ncf2294 3 years ago

    Thank you for your tutorial. I have learned a lot from it.

  • @fahadreda3060
    @fahadreda3060 4 years ago +2

    Great tutorial, love the biryani example 😂😂

    • @codebasics
      @codebasics  4 years ago +2

      Ha ha.. I knew, Fahad, that some biryani lovers were going to like it for sure. Looks like you like biryani, correct? :)

    • @fahadreda3060
      @fahadreda3060 4 years ago

      @@codebasics Who doesn't like biryani 😂😂? How much time does it take you to make these amazing videos? I teach data science in Arabic, which is way harder than in English, because some terms don't have a proper translation and there is no resource for data science in ARABIC!! So one webinar takes me 5 days, and a video takes me around 2 days, including video editing. So how much time does it take you to make these videos? Thanks again!

  • @mohdsyukur1699
    @mohdsyukur1699 7 months ago

    You are the best, my boss.

  • @tchintchie
    @tchintchie 4 years ago +1

    Invaluable 👏

  • @ITVishal
    @ITVishal 1 year ago

    What a lecture, omg.

  • @karthikb.s.k.4486
    @karthikb.s.k.4486 4 years ago +1

    Sir, what is the keyboard that you are using for programming?

  • @bratsummer1980
    @bratsummer1980 4 years ago +1

    Sir, one more question: is batch gradient descent also an artificial neural network?

    • @codebasics
      @codebasics  4 years ago

      Batch gradient descent is just a technique of gradient descent. It can be used with artificial neural nets as well as with statistical models such as linear regression.
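
In Keras terms, the choice between these gradient-descent variants comes down to the batch_size argument of model.fit. A minimal sketch, with toy random data standing in for a real training set:

    import numpy as np
    import tensorflow as tf

    # toy stand-in data: 100 samples, 4 features, binary labels
    X_train = np.random.rand(100, 4).astype('float32')
    y_train = np.random.randint(0, 2, size=(100,)).astype('float32')

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(1, activation='sigmoid', input_shape=(4,))
    ])
    model.compile(optimizer='sgd', loss='binary_crossentropy')

    # Batch gradient descent: one weight update per epoch, over the whole set.
    model.fit(X_train, y_train, epochs=5, batch_size=len(X_train), verbose=0)

    # Mini-batch gradient descent: one update per 32-sample batch (the common default).
    model.fit(X_train, y_train, epochs=5, batch_size=32, verbose=0)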

  • @jvandeal
    @jvandeal 2 years ago

    This was so good, thank you!

  • @Adinasa2
    @Adinasa2 3 years ago +1

    Looks like you always get confused between input layer dimensions and hidden layer dimensions.

  • @bratsummer1980
    @bratsummer1980 4 years ago

    Sir, please give an example with continuous-output regression or multi-class classification.

  • @sumitchhabra2419
    @sumitchhabra2419 3 years ago +4

    I loved your tutorials, brother. I have just one question: in every iteration we have a new set of data, and the dropped neurons are chosen at random, right?
    From this I infer that neurons will learn from different data and will not be biased towards certain inputs, right?

    • @codebasics
      @codebasics  3 years ago +2

      Yes, that is the correct understanding.

  • @spadiyar6725
    @spadiyar6725 3 years ago

    Why is dropout not used on test and validation data?

  • @asifurrahmankhan5006
    @asifurrahmankhan5006 3 years ago

    Why use one-hot encoding to convert 'y' into integers? Can't we do that with a simple 0 and 1 conversion? Can you clear this up, please?

    • @kmnm9463
      @kmnm9463 2 years ago

      Hi - if you, say, LabelEncode the y, the values will be 0 and 1, obviously, but the model may treat those values as having some kind of numerical order, i.e. 1 > 0. That is not what we want here: we want the model to learn M and R as plain categories, not M > R or R > M. That is why one-hot encoding is the better option. There is a detailed video on this by Sir, where he explains the logic of one-hot encoding very clearly. You can find it with a YT search for "One Hot Encode Codebasics".
      Thanks.
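
A sketch of the two encodings being compared, using pandas; the 'M'/'R' values mirror the sonar labels from the video, and the tiny series here is only illustrative:

    import pandas as pd

    y = pd.Series(['M', 'R', 'R', 'M'])

    # Label encoding: a single integer column; a model can misread 1 > 0 as an ordering.
    y_label = y.map({'R': 0, 'M': 1})

    # One-hot encoding: one indicator column per class, with no implied order.
    y_onehot = pd.get_dummies(y)
    print(y_onehot)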

  • @bratsummer1980
    @bratsummer1980 4 years ago

    We are huge fans of yours.....

  • @AKSHAY99552
    @AKSHAY99552 3 years ago

    Really great explanation.

  • @Martyniqo
    @Martyniqo 2 years ago

    Thank You so much!

  • @ashishsasankar1479
    @ashishsasankar1479 1 year ago

    Getting the message "accuracy not defined".

  • @AquarianVikas
    @AquarianVikas 2 years ago +2

    Hi, when I ran the classification report after adding the dropout layers, I got slightly lower accuracy and F1 scores. Is this normal, or could it be that I made some mistake?

    • @PhaniHarshithKotturu
      @PhaniHarshithKotturu 1 year ago

      That is expected: adding a dropout layer often decreases the accuracy a little, but that is fine, as it generalizes your model appropriately so that it performs well in all conditions.

    • @r0cketRacoon
      @r0cketRacoon 8 months ago

      High accuracy is not always good; it can also mean your model might be overfitting.

  • @AlonAvramson
    @AlonAvramson 3 years ago

    Thank you!

  • @faezeabdolinejad731
    @faezeabdolinejad731 3 years ago

    It's awesome, thanks.

  • @rishavbhattacharjee7182
    @rishavbhattacharjee7182 4 years ago +1

    Sir, I can't download the CSV file.

  • @osamashawky622
    @osamashawky622 3 years ago

    Very good.

  • @shaiksuleman3191
    @shaiksuleman3191 4 years ago

    Sir, can you please make videos on PyTorch?

    • @codebasics
      @codebasics  4 years ago

      Sure, it is on my to-do list, but first let me finish this TensorFlow series. I am also working on a data structures series that I need to finish as well.

    • @shaiksuleman3191
      @shaiksuleman3191 4 years ago

      @@codebasics Sorry sir, my mistake: I meant PyCaret.

  • @wbh786
    @wbh786 2 years ago

    Love you, sir.

  • @dataflex4440
    @dataflex4440 2 years ago

    Tea set or t-shirt 😂