Simple popularity-based recommendation, collaborative filtering using a 2D matrix, and content-based filtering - these three methods are mainly described in the video.
At 1:17:00: if your values are continuous, you can set a threshold, e.g. thres = 2.
5 - 4 < thres is considered close, whereas 5 - 2 > thres is not. Now you have a binary matrix for precision and recall.
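A minimal sketch of that idea (the matrices and the threshold value here are toy numbers of my own, not from the video):

```python
import numpy as np

# Toy actual vs. predicted ratings on a 0.5-5 scale (illustrative only).
actual    = np.array([[5.0, 3.0, 4.0],
                      [2.0, 4.5, 1.0]])
predicted = np.array([[4.0, 1.0, 3.5],
                      [4.5, 4.0, 1.5]])

thres = 2  # e.g. |5 - 4| < thres -> close, |5 - 2| > thres -> not close

# Binary matrix: 1 where the prediction is "close enough", 0 otherwise.
close = (np.abs(actual - predicted) < thres).astype(int)
print(close)
# This binary matrix can then feed the usual precision/recall computation
# in place of the continuous ratings.
```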
You made me understand the chronology of how things are supposed to be in a recommender system, and I'm very grateful. Thank you very much, Jill. Gonna go read your Medium posts 😉
17:00 Calculating the average rating
In the last method, matrix factorization, SVD was used. Can an autoencoder be used instead? Will it be computationally very expensive? Also, will the quality of recommendations get any better or stay similar?
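For context, here is a rough sketch of an SVD factorization step of the kind the video describes, using scipy on a toy random ratings matrix (not the video's actual code; variable names and sizes are mine). An autoencoder would replace this linear factorization with a learned encoder/decoder, which is usually more expensive to train; whether it actually improves recommendation quality tends to depend on the dataset.

```python
import numpy as np
from scipy.sparse.linalg import svds

# Toy user-item ratings matrix (random data for illustration only).
rng = np.random.default_rng(0)
ratings = rng.integers(0, 6, size=(20, 15)).astype(float)

# Centre each user's ratings, then take a truncated SVD with k latent factors.
user_means = ratings.mean(axis=1, keepdims=True)
U, sigma, Vt = svds(ratings - user_means, k=5)

# Reconstruct the matrix: the filled-in values act as predicted ratings.
predicted = U @ np.diag(sigma) @ Vt + user_means
print(predicted.shape)  # (20, 15)
```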
45:25 - Importance of matrix sparsity
Dear Jil, your tutorial is extremely helpful and well presented - well done!
However, I do have a small question about the section where you discuss sparsity.
In most documentation I see that sparsity is measured as the quotient of the number of zero or missing values over the total number of elements in the matrix.
However, you're using the number of ratings (rather than missing values) in the formula - why?
Considering that we have 610 x 9,724 = 5,931,640 elements and only 100,836 ratings, it would seem that our sparsity should be quite high (as most of the data is missing)!
Also, could you point to any documentation that discusses at which levels of sparsity the CF methods should be used?
Best regards,
Pawel
You're right - I found 5,830,804 empty elements out of 5,931,640, which means the sparsity is 98.30%, not 1.7%.
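A quick check with the numbers from this thread (variable names are mine) shows both quantities side by side - what the tutorial's formula reports is really the density:

```python
n_users, n_items = 610, 9724
n_ratings = 100_836

total = n_users * n_items          # 5,931,640 cells
empty = total - n_ratings          # 5,830,804 missing ratings

sparsity = empty / total           # fraction of cells with no rating
density = n_ratings / total        # fraction of cells that are rated

print(f"sparsity: {sparsity:.2%}") # ~98.30%
print(f"density:  {density:.2%}")  # ~1.70% (what the tutorial's formula reports)
```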
These two techniques are super old - like pre-2018 era. Most systems like TikTok, Instagram, etc. use a two-tower approach instead (see the sketch after this thread)...
Did you find a good reference for that approach that you could share?
Yeah, but it depends on the company.
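For anyone curious, here is a toy PyTorch sketch of the two-tower idea (not TikTok's or Instagram's actual implementation; the layer sizes and ID-only inputs are arbitrary choices of mine): each tower maps a user or an item to an embedding, and the score is their dot product.

```python
import torch
import torch.nn as nn

class TwoTower(nn.Module):
    """Minimal two-tower retrieval model: score(u, i) = dot(user_vec, item_vec)."""
    def __init__(self, n_users, n_items, dim=32):
        super().__init__()
        # Real systems feed rich features (history, context, item metadata)
        # into each tower; plain ID embeddings keep this sketch short.
        self.user_tower = nn.Sequential(nn.Embedding(n_users, dim), nn.Linear(dim, dim))
        self.item_tower = nn.Sequential(nn.Embedding(n_items, dim), nn.Linear(dim, dim))

    def forward(self, user_ids, item_ids):
        u = self.user_tower(user_ids)   # (batch, dim)
        v = self.item_tower(item_ids)   # (batch, dim)
        return (u * v).sum(dim=-1)      # dot-product score per (user, item) pair

model = TwoTower(n_users=610, n_items=9724)
scores = model(torch.tensor([0, 1]), torch.tensor([10, 20]))
print(scores.shape)  # torch.Size([2])
```

In practice the towers are typically trained with in-batch or sampled negatives, and the item embeddings are indexed for approximate nearest-neighbour search so retrieval stays fast at serving time.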
The matrix sparsity of the MovieLens 20M dataset using your code is 0.54%, whereas my findings imply the matrix sparsity is 100 - 0.54 = 99.46%, which makes more sense after all. I don't understand - isn't the formula sparsity = (# zero-valued elements) / (# total elements)?