CF with friends and user's influence considered on NYC data (updated Aug 11)
Here is the code link:
https://github.com/FassyGit/LightFM_liu/blob/master/U_F1.py
I use the NYC data, as in the other experiments.
The training/test split was made along the timeline, and I normalized the interaction matrix by replacing the check-in frequencies with check-in frequency percentages, which range between 0 and 1.
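The normalization step can be sketched as follows. This is my reading of the description above (per-user row normalization of the check-in counts), not the exact code from the repo:

```python
import numpy as np
from scipy.sparse import csr_matrix, diags

def normalize_checkins(interactions):
    """Replace check-in counts with per-user percentages in [0, 1].

    `interactions` is a (users x venues) sparse matrix of check-in counts.
    Each row is divided by that user's total number of check-ins.
    """
    interactions = csr_matrix(interactions, dtype=np.float64)
    row_sums = np.asarray(interactions.sum(axis=1)).ravel()
    row_sums[row_sums == 0] = 1.0  # leave empty rows as all-zero
    return diags(1.0 / row_sums).dot(interactions)

# toy counts: user 0 checked in 3x at venue 0 and 1x at venue 1, etc.
counts = csr_matrix(np.array([[3.0, 1.0, 0.0],
                              [0.0, 2.0, 2.0]]))
normalized = normalize_checkins(counts)
print(normalized.toarray())  # each nonzero row sums to 1
```

Each row of the result sums to 1, so all entries lie in [0, 1] as described above.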
I then trained the LightFM model on this normalized matrix; the result was slightly worse than with the original data. Here is the outcome:
The model was trained with the WARP loss function:

I am beginning to model
model has been fitted
this is the model that considers the check-in times
Time used: 4.9109 s
Train_auc is 0.999486
Test_auc is 0.763801
train_pm_auc is 0.860700, test_pm_auc is 0.685053
In theory, the WARP loss function treats the input data as binary (any positive interaction counts the same), so there should not be any difference; but the actual result was slightly worse...
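A quick NumPy check of this reasoning: WARP only distinguishes observed from unobserved user-item pairs, so the raw counts and the normalized percentages induce exactly the same binary positive set (toy matrices, assumed for illustration):

```python
import numpy as np

# toy raw check-in counts and their row-normalized percentages
raw = np.array([[3.0, 1.0, 0.0],
                [0.0, 2.0, 2.0]])
normalized = raw / raw.sum(axis=1, keepdims=True)

# WARP ranks positives above sampled negatives; it only cares
# whether an entry is nonzero, not about its magnitude, so both
# matrices describe the same set of positive pairs.
assert np.array_equal(raw > 0, normalized > 0)
print((raw > 0).astype(int))
```

If the binary positive sets are identical, any performance gap must come from somewhere else, e.g. how the implementation uses the interaction values (such as sample weighting) during updates.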
I think it may have something to do with how this model uses the interaction values in its updates.
Then I used the normalized matrix for collaborative filtering; so far, the evaluation metric I use is RMSE.
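RMSE over the observed entries can be sketched like this (toy numbers for illustration; not the repo's evaluation code):

```python
import numpy as np

def rmse(predicted, actual, mask):
    """Root-mean-square error over the observed (masked) entries only."""
    diff = (predicted - actual)[mask]
    return np.sqrt(np.mean(diff ** 2))

# toy normalized check-in matrix and hypothetical model predictions
actual = np.array([[0.75, 0.25, 0.0],
                   [0.0, 0.5, 0.5]])
predicted = np.array([[0.70, 0.30, 0.1],
                      [0.20, 0.45, 0.55]])
mask = actual > 0  # evaluate only where a check-in was observed
print(rmse(predicted, actual, mask))  # ≈ 0.05
```

Restricting the error to observed entries matters here: with the normalized matrix, the zeros are unobserved pairs, not known ratings of zero.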
But considering that the goal is recommendation, I will use precision@k (pre@k) instead.
But I have not done that yet...
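As a starting point, precision@k for a single user could be computed roughly as follows (a sketch with hypothetical toy scores, not the final evaluation code):

```python
import numpy as np

def precision_at_k(scores, relevant, k):
    """Fraction of the top-k scored items that are relevant.

    scores:   1-D array of predicted scores for one user, over all items
    relevant: set of item indices the user actually visited (test set)
    """
    top_k = np.argsort(-scores)[:k]  # indices of the k highest scores
    hits = sum(1 for item in top_k if item in relevant)
    return hits / k

scores = np.array([0.9, 0.1, 0.8, 0.4])
relevant = {0, 3}
print(precision_at_k(scores, relevant, 2))  # top-2 = items 0 and 2 -> 0.5
```

The per-user values would then be averaged over all test users to get the overall pre@k.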
TBC