Loss Functions: Data Science Basics

  • Published: 24 Dec 2024

Comments • 70

  • @tech-n-data
    @tech-n-data 8 months ago +5

    This is such an undervalued channel. You are out here making data scientists out of everyday people. Thank you for your content.

  • @cleansquirrel2084
    @cleansquirrel2084 4 years ago +5

    I'm a commerce student. I started watching your videos because of the time series analysis of stock prices.
    Your data science videos are still so easy for me to understand.

  • @noahrubin375
    @noahrubin375 3 years ago +24

    This brought it all together so nicely. You're a great teacher

  • @shilpitiwari3536
    @shilpitiwari3536 3 years ago

    I deeply appreciate the importance of choosing the loss function for different models. As a beginner in data science, it makes a lot of sense and gives a better understanding of the concepts.

  • @nikunjgattani999
    @nikunjgattani999 4 years ago

    This is the best explanation of loss functions among all the blogs and videos I've come across.

  • @edmundoguerramendoza7465
    @edmundoguerramendoza7465 3 years ago +9

    I really love the way you summarize a lot of the concepts; this is helping me a lot to understand other lectures in some of my courses. Great job! Many thanks!

  • @emmanuelibrahim6427
    @emmanuelibrahim6427 2 years ago

    Exceptionally gifted with intelligence, and the ability to understand subjects and transfer knowledge. Thank you!

  • @zilaleizaldin1834
    @zilaleizaldin1834 1 month ago

    Wow! You are a great data scientist who really conveys the ideas!!!

  • @mardelfaria2183
    @mardelfaria2183 4 years ago +1

    I learn a lot from your videos. Simple and straight to the point. Thanks for contributing, my dear!

  • @Seilor_12
    @Seilor_12 6 months ago

    What an amazing tutorial. You managed to give a detailed yet simple explanation; gained a subscriber.

  • @jarrelldunson
    @jarrelldunson 4 years ago

    Ritvikmath ... you are a great communicator... thanks

  • @shenfrances
    @shenfrances 4 years ago +1

    I really like your videos! Just followed your channel last night and I am looking forward to more of your future videos!

  • @stephenenke9495
    @stephenenke9495 4 years ago

    Going through Data Science bootcamp at Flatiron School. This was supa' helpful. Thank you!

  • @suparnaprasad8187
    @suparnaprasad8187 9 months ago

    Completely new to ML and this was just amazing. Thank you!

  • @rickmartel8665
    @rickmartel8665 2 years ago

    Very clear presentation. Thanks for keeping it simple.

  • @samwatson6749
    @samwatson6749 4 years ago +1

    yoo this was exactly what I needed! You speak so clearly too thanks man

  • @diegooliveira7713
    @diegooliveira7713 1 year ago

    You're amazing! Keep up the great work!

  • @paulbrown5839
    @paulbrown5839 4 years ago +1

    Another great video. Well done!

  • @mohitramana
    @mohitramana 2 years ago

    Absolutely amazing, both your understanding and your explanation.

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 4 years ago +1

    Just a suggestion: a series on stochastic processes and possibly even short-rate models.

  • @Regelar
    @Regelar 1 year ago

    Really well explained, great teacher!

  • @arindamrana
    @arindamrana 2 years ago

    Amazing video! Nicely explained.

  • @baxoutthebox5682
    @baxoutthebox5682 1 year ago

    Amazing explanation

  • @junderfitting8717
    @junderfitting8717 2 years ago

    Please explain what an activation function is, you are great!!!!

  • @goelnikhils
    @goelnikhils 1 year ago

    Amazing Video

  • @Maryfah-cc5zh
    @Maryfah-cc5zh 5 months ago

    very helpful for my exam. thanks

  • @lironsimon4523
    @lironsimon4523 4 years ago +1

    PERFECTLY explained! Thanks🙌

  • @ryanjackson0x
    @ryanjackson0x 3 years ago +1

    How would the weights that minimize a hinge loss function be determined?

  • @andretan8225
    @andretan8225 3 years ago +1

    Do I use 0-1 loss if I am using a DecisionTreeClassifier?

  • @laurasofiabayona2288
    @laurasofiabayona2288 2 years ago

    Amazing video, thank you!!!!

  • @xinking2644
    @xinking2644 2 years ago +1

    Great video!!! Awesome!

  • @Ashok-oe3lo
    @Ashok-oe3lo 9 months ago

    thank you ... great explanation

    • @ritvikmath
      @ritvikmath  9 months ago

      Glad it was helpful!

  • @ysjang05050
    @ysjang05050 2 years ago

    Great explanation! Thank you so much!

  • @maciejk2306
    @maciejk2306 4 years ago +2

    Hi! Is there a way to choose a loss function from a predefined list, or even to define a loss function by myself, to use it with scikit-learn algorithms? Or do I have to write a whole ML algorithm from scratch to change the default loss function?
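
    A minimal sketch in response to the question above (my own illustration, not from the video): scikit-learn's SGD estimators let you pick the training loss from a predefined list via their loss argument, but a fully arbitrary user-defined training loss is generally not supported by the built-in estimators; for that you would typically minimize your own objective directly (see the scipy sketch further down in the comments) or use a framework such as PyTorch.

        # Assumes scikit-learn >= 1.1, where the logistic option is spelled "log_loss".
        from sklearn.linear_model import SGDClassifier, SGDRegressor

        clf_hinge = SGDClassifier(loss="hinge")    # hinge loss (linear SVM-style)
        clf_log = SGDClassifier(loss="log_loss")   # logistic loss
        reg_huber = SGDRegressor(loss="huber")     # robust Huber loss for regression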

  • @dmitryoksen
    @dmitryoksen 2 years ago

    nice examples!

  • @furkantaspinar355
    @furkantaspinar355 4 years ago

    Thank you, really well explained. You helped me understand my lecture.

  • @EW-mb1ih
    @EW-mb1ih 3 years ago

    Thank you. For the square loss, why don't we hear about cubic loss or quartic loss? Also, is the choice of square loss also based on its derivative, compared to a linear loss?

    • @ChocolateMilkCultLeader
      @ChocolateMilkCultLeader 3 years ago

      Squares are used to avoid negative values. The square also increases the range, penalizing the worst values more heavily. A cubic (or any odd power) would give negative values, and using high positive powers might hurt discovery and/or make analysis hard (the values will be too big or too small depending on the domain). It also gets expensive.
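
    A tiny numeric illustration of the reply above (my own example, not from the video): with an even power every residual contributes a non-negative amount, while an odd power lets positive and negative errors offset each other.

        # Residuals from some hypothetical model: one large miss, two small ones.
        residuals = [-2.0, 1.0, 1.0]

        squared = sum(r**2 for r in residuals)   # 6.0  -> every term is non-negative
        cubic   = sum(r**3 for r in residuals)   # -6.0 -> the large miss is offset by the others
        quartic = sum(r**4 for r in residuals)   # 18.0 -> dominated by the large miss
        print(squared, cubic, quartic)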

  • @Fat_Cat_Fly
    @Fat_Cat_Fly 4 years ago

    incredible !!!! so good!!!!!

  • @pengfeiyang259
    @pengfeiyang259 1 year ago

    Perfect! Thank you

  • @Exploringtheworldrightnow
    @Exploringtheworldrightnow 2 years ago

    Can the loss functions be applied to all models indiscriminately? For example, the exponential loss function to a support vector classifier? Thanks.

  • @AshokKumar-lk1gv
    @AshokKumar-lk1gv 4 years ago

    very nice lecture

  • @mattmatt245
    @mattmatt245 3 years ago

    Is it possible to apply a custom loss function in a regression model (or any other algorithm for predicting a continuous variable)? I'm working on a stock market prediction model and I need to maximize the following loss function: if [predicted] < [actual] then [predicted] else [-actual]. Would that be possible? Thanks
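
    A rough sketch of one way to fit a regression model under a custom loss (my own illustration on synthetic data, not from the video): write the objective as a plain Python function and hand it to a generic optimizer. The asymmetric penalty below is only a stand-in; the payoff rule from the comment above would go in its place.

        import numpy as np
        from scipy.optimize import minimize

        # Synthetic data: 100 samples, 3 features, a known linear relationship plus noise.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 3))
        y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

        def custom_loss(w):
            # Linear model: first three entries are weights, the last is the intercept.
            pred = X @ w[:-1] + w[-1]
            err = pred - y
            # Example asymmetry: over-prediction costs 3x more than under-prediction.
            return np.where(err > 0, 3.0 * err**2, err**2).mean()

        result = minimize(custom_loss, x0=np.zeros(X.shape[1] + 1), method="Nelder-Mead")
        print(result.x)  # fitted weights and intercept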

  • @sjoerdbodewes3270
    @sjoerdbodewes3270 4 years ago +1

    Thanks, very helpful. Could you please do a video on Structural VARs (SVAR)? I'm really struggling with this concept. Greetings from Holland!

  • @ALG397
    @ALG397 4 years ago +1

    Thank you.
    We need a practical example with Python.

  • @anusharajendra389
    @anusharajendra389 1 year ago

    Amazing

  • @mannemvenkatesh4004
    @mannemvenkatesh4004 4 years ago

    Hi, can you make a video on how white noise affects a model when NLP algorithms are used?
    Thanks

  • @KoliHarshad007
    @KoliHarshad007 4 years ago

    That catch in the beginning😂😂

  • @yourweebtv8733
    @yourweebtv8733 1 year ago

    thank u teacher

  • @hannathunkable5254
    @hannathunkable5254 2 years ago

    POV: youtuber puts in ten times the effort to explain a concept compared to your university professor

  • @Niiiic-x8r
    @Niiiic-x8r 1 year ago

    OMG this is so useful

  • @xxshogunflames
    @xxshogunflames 4 years ago

    awesome vid!

  • @RonLWilson
    @RonLWilson 2 years ago

    In this video f(xi) is called the score
    and exp(-yi f(xi)) is called the loss,
    so what is yi f(xi) called? Does it have a name?
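
    For reference, standard terminology that may help with the question above (not taken from the video): the product yi f(xi) is usually called the margin of example i, so the exponential loss can be written as

        L(yi, f(xi)) = exp(-yi f(xi)) = exp(-mi),   where mi = yi f(xi) is the margin.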

  • @kiannaderi8374
    @kiannaderi8374 11 months ago

    thank you

  • @n.m.c.5851
    @n.m.c.5851 3 years ago

    Thanks

  • @igoroliveira5463
    @igoroliveira5463 3 years ago +1

    almost let the pen fall on this one hahahah

  • @javierargueta7355
    @javierargueta7355 4 years ago +1

    I'm starting out in data science. I'm not bad at math, but my question has always been "how do I convert non-numeric data into numeric data?" I'm a noob 😴

    • @pizzaguy8484
      @pizzaguy8484 4 years ago +2

      you should check one-hot encoding, but I don't think it's related to this video at all.

    • @javierargueta7355
      @javierargueta7355 4 years ago

      @@pizzaguy8484 It isn't indeed haha, but I saw no comments and then commented. I'm a Latino and I prefer English content because I'm learning the language. Your comment really helped me, thank you so much.

    • @pizzaguy8484
      @pizzaguy8484 4 years ago +2

      @@javierargueta7355 No worries! Good luck with your studies, don't give up!
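
    A minimal sketch of the one-hot encoding mentioned in this thread (my own example, assuming pandas is available): each category in a non-numeric column becomes its own indicator column, turning the data into numbers a model can use.

        import pandas as pd

        df = pd.DataFrame({"color": ["red", "green", "blue", "green"]})
        # One indicator column per category, e.g. color_blue, color_green, color_red.
        encoded = pd.get_dummies(df, columns=["color"])
        print(encoded)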

  • @anaydongre1226
    @anaydongre1226 4 years ago

    Data science is a very vast topic. Any suggestions on how to master it?