ML 14 : Overfitting VS Underfitting | Bias VS Variance | Examples

  • Published: 19 Nov 2021
  • Connect with me by:
    LIKE & SHARE Videos with your friends.
    SUBSCRIBE @csittutorialsbyvrushali
    Instagram: / cs_and_it_tutorial_by_...
    Facebook: / cs-it-tutorials-by-vru...
    DATABASE MANAGEMENT SYSTEM:
    • Database Management Sy...
    MACHINE LEARNING USING PYTHON:
    • Machine Learning Using...
    DATA STRUCTURE & ALGORITHMS:
    • Data Structure (FDS / ...
    HUMAN COMPUTER INTERACTION:
    • Human Computer Interac...
    SYSTEM PROGRAMMING / COMPILER DESIGNING:
    • System Programming / C...
    PROCESSOR ARCHITECTURE & INTERFACE:
    • Processor Architecture...
    EXAM / INTERVIEW PREPARATION:
    • Exam / Interview Prepa...
    PROJECT / PRESENTATION & FREE CERTIFICATION IDEAS:
    • Project, Presentation ...
    Keep Watching..!
    Keep Learning..!
    Thank You..!
    #machinelearning #overfitting #underfitting #bias #variance #machinelearningalgorithm #machinelearningbasics #machinelearningfullcourse #ml #machinelearningwithpython #machinelearningengineer #machinelearningtutorialforbeginners #machinelearningtutorial #csandittutorialsbyvrushali #vrushali #trending #trendingtopic
  • Science

Comments • 8

  • @jagadguru2372 • 2 years ago

    Can you work through some example problems about overfitting in a model, please?

  • @moksha333333 • 1 year ago • +4

    Hello ma'am, I think the last point you made about early stopping is slightly different. I'll share what I understand; please check:
    Early stopping is a technique used to prevent overfitting by interrupting training before the model has a chance to memorize the noise and random variations in the training data.
    The idea is to monitor the model's performance on a validation set during training. The validation set is a separate set of data used to evaluate the model, not to train it. Training is stopped when performance on the validation set starts to decrease or plateau, which indicates that the model has begun to overfit.
    In practice, the model is evaluated on the validation set at regular intervals during training, and training is interrupted once validation performance stops improving. The model's parameters are then restored to the point where validation performance was highest; that snapshot is treated as the optimal model, since it had not yet begun to overfit.
    Early stopping can be an effective way to prevent overfitting, but it can be difficult to apply in practice, especially when the validation set is small or the performance metric is noisy.
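
Below is a minimal sketch of the patience-based early-stopping loop described in the comment above, assuming plain NumPy gradient descent on a synthetic regression problem; the toy model and names such as `patience` and `best_params` are illustrative only and not taken from the video.

```python
# Minimal early-stopping sketch (assumption: plain NumPy gradient descent on
# synthetic data; this only illustrates the monitor / patience / restore logic,
# not the exact method from the video).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + noise.
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + rng.normal(scale=0.3, size=200)

# Hold out a validation set that is never used for the gradient updates.
X_train, X_val = X[:150], X[150:]
y_train, y_val = y[:150], y[150:]

def mse(X, y, w, b):
    """Mean squared error of the linear model w*x + b."""
    return np.mean((X[:, 0] * w + b - y) ** 2)

w, b = 0.0, 0.0              # model parameters
lr = 0.1                     # learning rate
patience = 10                # epochs to wait for an improvement before stopping
best_val_loss = np.inf
best_params = (w, b)
epochs_without_improvement = 0

for epoch in range(1000):
    # One full-batch gradient-descent step on the training set.
    err = X_train[:, 0] * w + b - y_train
    w -= lr * 2 * np.mean(err * X_train[:, 0])
    b -= lr * 2 * np.mean(err)

    # Monitor performance on the validation set at every epoch.
    val_loss = mse(X_val, y_val, w, b)
    if val_loss < best_val_loss:
        best_val_loss = val_loss
        best_params = (w, b)             # snapshot the best model so far
        epochs_without_improvement = 0
    else:
        epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            print(f"Early stop at epoch {epoch}")
            break

# Restore the parameters from the epoch with the best validation loss.
w, b = best_params
print(f"Best validation MSE: {best_val_loss:.4f}, w = {w:.3f}, b = {b:.3f}")
```

Keeping a best-so-far snapshot and waiting `patience` epochs before stopping is one common way to cope with the noisy validation metrics the comment mentions; deep-learning frameworks expose the same idea as a built-in option (for example, Keras's EarlyStopping callback with restore_best_weights=True).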

  • @divyapawar6269 • 1 year ago • +1

    Thank you for the explanation! The examples made it easy to understand :)

  • @nihalsyd7442 • 8 months ago • +1

    Thank you for explaining it in the simplest way!

  • @m.mashesh1966 • 1 year ago • +1

    The video screen is not visible, ma'am.