ML 14 : Overfitting VS Underfitting | Bias VS Variance | Examples

  • Published: 6 Jan 2025

Comments • 16

  • @gayathripamarthi8451 2 months ago +1

    The example you took is so good!!

  • @divyapawar6269 1 year ago +2

    Thank you for the explanation! The examples made it easy to understand :)

  • @nihalsyd7442 1 year ago +1

    Thank you for explaining it in the simplest way!

  • @moksha333333 1 year ago +6

    Hello ma'am, I think your last point about early stopping is describing something else. I'll share what I understand, please check:
    Early stopping is a technique used to prevent overfitting in machine learning by interrupting the training process before the model has a chance to fully memorize the noise and random variations in the training data.
    The idea behind early stopping is to monitor the performance of the model on a validation set during the training process. The validation set is a separate set of data that is used to evaluate the model's performance, but not used to train the model. The training process is stopped when the performance on the validation set starts to decrease or plateau, indicating that the model has begun to overfit.
    Early stopping can be implemented by monitoring the performance of the model on the validation set at regular intervals during the training process, and interrupting the training when the performance on the validation set starts to decrease or plateau. The model's parameters are then saved at the point where the performance on the validation set was the highest. This point is considered to be the optimal model, as it has not yet begun to overfit.
    Early stopping can be an effective technique to prevent overfitting, but it can also be difficult to implement in practice, especially when the validation set is small or the performance metric is noisy.
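    The procedure this comment describes can be sketched in a few lines. This is a minimal, hypothetical illustration (not code from the video): monitor a validation loss, count how many checks pass without improvement, stop after `patience` bad checks, and keep the parameters from the best validation point.

    ```python
    def train_with_early_stopping(train_step, val_loss, max_epochs=100, patience=5):
        """train_step(epoch) -> current params; val_loss(params) -> float.

        Stops when validation loss has not improved for `patience` checks
        and returns the parameters saved at the best validation point.
        """
        best_loss = float("inf")
        best_params = None
        bad_checks = 0
        for epoch in range(max_epochs):
            params = train_step(epoch)
            loss = val_loss(params)
            if loss < best_loss:
                # New best: save a snapshot and reset the patience counter.
                best_loss, best_params, bad_checks = loss, params, 0
            else:
                bad_checks += 1
                if bad_checks >= patience:
                    break  # validation loss plateaued or rose: stop training
        return best_params, best_loss

    # Toy run: validation loss falls, then rises (the classic overfitting curve).
    # "params" here is just the epoch index, standing in for real model weights.
    losses = [1.0, 0.6, 0.4, 0.35, 0.37, 0.4, 0.45, 0.5, 0.6, 0.7]
    params, loss = train_with_early_stopping(
        train_step=lambda e: e,
        val_loss=lambda e: losses[e],
        max_epochs=len(losses), patience=3)
    # Training stops early and returns the snapshot from epoch 3 (loss 0.35).
    ```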

  • @vikashtiwari8360 12 days ago +1

    Thank you so much, ma'am

  • @ashyy2825 26 days ago

    Overfitting - the model performs well in the training phase but does not perform well in the testing phase.
    Fixes:
    * Train with more data - e.g. 80% for training and 20% for testing
    * Remove unwanted features - null values, duplicate values
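    The two remedies this comment lists can be sketched as follows. This is a hypothetical plain-Python illustration (not code from the video); real projects would typically use a library such as pandas or scikit-learn for these steps.

    ```python
    import random

    def clean(rows):
        """Drop rows containing None values, then drop exact duplicates (order kept)."""
        seen, out = set(), []
        for row in rows:
            if any(v is None for v in row):
                continue  # skip rows with null values
            if row in seen:
                continue  # skip duplicate rows
            seen.add(row)
            out.append(row)
        return out

    def split_80_20(rows, seed=0):
        """Shuffle reproducibly, then return (train, test) as an 80/20 split."""
        rows = rows[:]
        random.Random(seed).shuffle(rows)
        cut = int(0.8 * len(rows))
        return rows[:cut], rows[cut:]

    data = [(1, 2), (1, 2), (3, None), (4, 5), (6, 7), (8, 9)]
    cleaned = clean(data)            # duplicate (1, 2) and the null row removed
    train, test = split_80_20(cleaned)
    ```

    After cleaning, four rows remain, so the split yields three training rows and one test row.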

  • @ashyy2825 26 days ago

    Underfitting - the model performs poorly in both the training and testing phases (low accuracy and poor prediction results).

  • @jagadguru2372 3 years ago

    Can you work through some example problems about overfitting in a model, please?

  • @ashyy2825 26 days ago

    Overfitting - the curve tries to pass through every point instead of using the best-fit line.
    Underfitting - the points remain scattered around the model; it is not the best-fit line.
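    The "best-fit line" intuition above can be shown numerically. This is my own small illustration (not from the video), assuming data drawn from a straight line plus noise: a constant (degree 0) misses the trend entirely (underfitting), a degree-1 line matches the true shape, and a very high-degree polynomial chases every training point (overfitting), so both extremes do worse on held-out points.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x_train = np.linspace(0, 1, 8)
    y_train = 2 * x_train + 1 + rng.normal(0, 0.05, size=8)  # true line + noise
    x_test = np.linspace(0.05, 0.95, 8)                      # held-out points
    y_test = 2 * x_test + 1                                  # noiseless truth

    def test_error(degree):
        """Least-squares polynomial fit on train; mean squared error on test."""
        coeffs = np.polyfit(x_train, y_train, degree)
        pred = np.polyval(coeffs, x_test)
        return float(np.mean((pred - y_test) ** 2))

    err_under = test_error(0)  # constant model: underfits, misses the trend
    err_good = test_error(1)   # straight line: matches the data's true shape
    err_over = test_error(7)   # interpolates all 8 train points: overfits
    ```

    The degree-1 line gives the lowest test error; the constant and the degree-7 curve both do worse, from opposite failure modes.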

  • @m.mashesh1966 2 years ago +1

    The video screen is not visible, ma'am