Every week ❤
It's very helpful
Glad it helped. Keep joining the next live quizzes as well.
Backpropagation calculates both the error and the gradients.
> First the error is calculated. (I think this is considered the first step in backpropagation.)
> Next the gradients are re-computed or re-calculated to reduce the error.
Re-calculation is calculation as well, which is pretty obvious. Updating and re-calculating are synonyms.
Please correct me if I'm wrong.
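For anyone following along, here is a minimal sketch that separates the three steps, assuming PyTorch (the toy model, data, and learning rate are hypothetical, purely for illustration): the forward pass computes the error (loss), the backward pass computes the gradients, and the weight update is a separate step applied by the optimizer, not by backpropagation itself.

```python
import torch

# Hypothetical toy setup: a single linear layer and random data, for illustration only.
model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

x = torch.randn(4, 2)
y = torch.randn(4, 1)

# Step 1 (forward pass): compute predictions and the error (loss).
loss = loss_fn(model(x), y)

# Step 2 (backward pass, backpropagation proper): compute the gradient of the
# loss with respect to every parameter.
optimizer.zero_grad()
loss.backward()

# Step 3 (update): the optimizer changes the weights using the gradients
# computed in step 2.
optimizer.step()
```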
This is very good, thank you, though I am new to data science.
Welcome, please join live next week
Please continue this series, and also add some more data science libraries and frameworks.
Sure - please join live this weekend
Please create a Quiz on Boosting Algorithms, Aman.
XGBoost, AdaBoost, CatBoost, etc. Let's dig deep into them.
Loss
Thanks