Hi, congratulations on your new paper! I'm really enjoying it!

Do you have any plans to further extend your analysis to graph convolutional NN methods (as opposed to ConvNCF-like methods)? Some of them seem to have the same methodological issues you mentioned in your RecSys 2019 paper, like this one.

I also have a question regarding the Gram matrix that is inverted in the EASE_R recommender, in RecSys2019_DeepLearning_Evaluation/EASE_R/EASE_R_Recommender.py, lines 54 to 59 in 155f8dd.

I wanted to examine whether I could speed up the matrix inversion by using a Cholesky decomposition. However, it complained that the Gram matrix, even after including the regularization term, is not positive definite, which Cholesky cannot handle. I then noticed that your Compute_Similarity class sets the similarity between an item and itself to 0, while X_URM.T.dot(X_URM) yields the same matrix except for the diagonal. Accordingly, I think the resulting W_sparse differs from the original version of EASE.
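A minimal sketch with toy data (hypothetical, not the repository's actual URM) reproducing the failure I saw: the regularized Gram matrix itself is positive definite and factors fine, but once the diagonal is zeroed, the matrix has trace zero and therefore a negative eigenvalue, so `np.linalg.cholesky` rejects it.

```python
import numpy as np

# Toy binary user-item matrix (hypothetical data, not the repo's URM).
rng = np.random.default_rng(0)
X_URM = (rng.random((100, 20)) < 0.2).astype(float)
reg = 1e-2

# The true regularized Gram matrix X^T X + reg * I is positive definite,
# so the Cholesky factorization succeeds.
gram = X_URM.T @ X_URM + reg * np.identity(X_URM.shape[1])
L = np.linalg.cholesky(gram)

# Zeroing the diagonal (as a KNN-style similarity does for self-similarity)
# yields a symmetric matrix with trace 0; any nonzero such matrix must have
# a negative eigenvalue, so it cannot be positive definite.
gram_zero_diag = gram.copy()
np.fill_diagonal(gram_zero_diag, 0.0)
try:
    np.linalg.cholesky(gram_zero_diag)
except np.linalg.LinAlgError:
    print("not positive definite")
```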
Thank you! I am glad you found our work interesting :)
We hope that our work will raise awareness of how frequent methodological problems are in published research (even at high-level venues), so that they may be spotted more easily and so that evaluation and reproducibility studies become more common in our community. The issues that seem to be present in the project you mention (information leakage and an erroneous NDCG implementation) are not tied to the nature of the method itself (as opposed to those of ConvNCF) and were already identified as rather common in our previous analysis. We may in the future try to do another study like the RecSys 2019 one to see if something has changed.
About your question on EASE_R, you are indeed correct. Thank you for spotting the problem!
The Compute_Similarity class is used for the KNNs, and I had forgotten that it removes the self-similarity, hence the zeroed diagonal. I have updated EASE_R to fix the problem, recomputed all results, and updated the analysis.
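For reference, a minimal sketch (my own, not the repository's updated code) of the EASE closed-form solution from Steck's WWW 2019 paper, building the Gram matrix directly from the URM so the diagonal stays intact:

```python
import numpy as np

def ease_item_weights(X_URM, reg=100.0):
    """Closed-form EASE solution: B[i, j] = -P[i, j] / P[j, j] off-diagonal
    and diag(B) = 0, where P = (X^T X + reg * I)^{-1}.  Uses the full Gram
    matrix, diagonal included, rather than a KNN similarity whose
    self-similarity has been zeroed out."""
    n_items = X_URM.shape[1]
    G = X_URM.T @ X_URM + reg * np.identity(n_items)
    P = np.linalg.inv(G)       # G is positive definite, so this is safe
    B = P / (-np.diag(P))      # broadcasting divides column j by -P[j, j]
    np.fill_diagonal(B, 0.0)   # enforce the zero-diagonal constraint
    return B
```

Since `G` is positive definite by construction, the inverse could equally be obtained via a Cholesky factorization (e.g. `scipy.linalg.cho_factor` / `cho_solve`), which was the original motivation of the question above.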
I'm amazed at how quickly the bug was fixed and the experiments were updated!
It seems that the fix improves the performance of EASE_R, especially on the Movielens20M dataset, which confirms the authors' original claim.

> We may in the future try to do another study like the RecSys 2019 to see if something has changed.
Great! Looking forward to seeing your future work!