The example you took is so good!!
Thanks. Keep Learning 😊👍
Thank you for the explanation! Examples made it easy to understand :)
thank you for explaining in the simplest way
Hello ma'am, I think the last point of what you said about early stopping is something else. I'll share what I found, please check:
Early stopping is a technique used to prevent overfitting in machine learning by interrupting the training process before the model has a chance to fully memorize the noise and random variations in the training data.
The idea behind early stopping is to monitor the performance of the model on a validation set during the training process. The validation set is a separate set of data that is used to evaluate the model's performance, but not used to train the model. The training process is stopped when the performance on the validation set starts to decrease or plateau, indicating that the model has begun to overfit.
Early stopping can be implemented by monitoring the performance of the model on the validation set at regular intervals during the training process, and interrupting the training when the performance on the validation set starts to decrease or plateau. The model's parameters are then saved at the point where the performance on the validation set was the highest. This point is considered to be the optimal model, as it has not yet begun to overfit.
Early stopping can be an effective technique to prevent overfitting, but it can also be difficult to implement in practice, especially when the validation set is small or the performance metric is noisy.
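The monitoring loop described above can be sketched in plain Python. This is a minimal sketch, not a full training framework: the validation scores are simulated, and the `patience` parameter (how many non-improving checks to tolerate) is an assumed convention, not something from the original comment.

```python
# Minimal early-stopping sketch: stop training when the validation
# score stops improving for `patience` consecutive checks, and keep
# the epoch where the score was highest (where we'd save the params).
def early_stopping_train(val_scores, patience=2):
    """val_scores: validation score after each epoch (higher is better)."""
    best_score = float("-inf")
    best_epoch = 0
    bad_checks = 0
    for epoch, score in enumerate(val_scores):
        if score > best_score:
            best_score = score      # new best: "save" the parameters here
            best_epoch = epoch
            bad_checks = 0
        else:
            bad_checks += 1         # validation decreased or plateaued
            if bad_checks >= patience:
                break               # interrupt training early
    return best_epoch, best_score

# Simulated validation accuracies: improve, then overfit and decline.
scores = [0.70, 0.78, 0.83, 0.85, 0.84, 0.82, 0.80]
print(early_stopping_train(scores))  # (3, 0.85)
```

With `patience=2`, training stops after epochs 4 and 5 both fail to beat the epoch-3 score, so the saved model is the one from epoch 3, before overfitting set in.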
correct
nice
thank you so much, ma'am
You're welcome 😊 Keep Learning.
Overfitting - the model trains well in the training phase but does not perform well in the testing phase.
Training is done properly, but testing is not.
*Training with more data - 80% for train and 20% for test
*Removing unwanted features - null values, duplicate values
Underfitting - the model is neither properly trained nor properly tested (reduced accuracy and prediction results)
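The two fixes listed above (an 80/20 split and removing null or duplicate rows) can be sketched with the standard library alone. The function name `clean_and_split` and the fixed `seed` are illustrative assumptions, not part of the original notes.

```python
import random

# Sketch of the two fixes from the notes above: drop rows with null
# values or exact duplicates, then split 80% train / 20% test.
def clean_and_split(rows, test_ratio=0.2, seed=42):
    # Remove rows containing None (null) values.
    rows = [r for r in rows if None not in r]
    # Remove duplicate rows, keeping the first occurrence.
    seen, unique = set(), []
    for r in rows:
        if r not in seen:
            seen.add(r)
            unique.append(r)
    # Shuffle reproducibly, then split 80/20.
    rng = random.Random(seed)
    rng.shuffle(unique)
    cut = int(len(unique) * (1 - test_ratio))
    return unique[:cut], unique[cut:]

data = [(1, 2), (1, 2), (3, None), (4, 5), (6, 7), (8, 9), (10, 11)]
train, test = clean_and_split(data)
print(len(train), len(test))  # 4 1
```

One duplicate and one null row are dropped, leaving 5 clean rows split into 4 train and 1 test. In practice the same cleaning is usually done with pandas `dropna()` and `drop_duplicates()`.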
Can you share some practice problems about overfitting in a model, please?
Overfitting - the model tries to cover all the points instead of using the best-fit line.
Underfitting - the points stay scattered around the model; it is not the best-fit line.
Video screen is not visible, ma'am
It's visible.
@csittutorialsbyvrushali not visible