Concatenating Additional Features #12
Comments
Hi @danielressi, We tried something similar in the past in our RecSys 2016 paper on Parallel Recurrent Neural Networks. Hope it helps!
@mquad thank you so much for your reply. I will have a look at the paper and try out late fusion.
@mquad Is there a repository for a "Parallel Recurrent Neural Networks" implementation?
I'm sorry but there's no public implementation of that paper AFAIK |
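Since no public implementation exists, here is a minimal NumPy sketch of the late-fusion idea from the Parallel RNN paper as discussed above: two subnetworks (reduced to single linear layers here for brevity) process the item embedding and the extra features separately, and their hidden states are combined only at the output. All dimensions, names, and the fixed fusion weight are illustrative assumptions, not code from the paper or this repository.

```python
import numpy as np

rng = np.random.default_rng(0)

batch_size, embed_dim, feat_dim, hidden_dim = 4, 8, 5, 16

# Two separate input streams: item-ID embeddings and additional features.
item_embed = rng.standard_normal((batch_size, embed_dim)).astype(np.float32)
extra_feats = rng.standard_normal((batch_size, feat_dim)).astype(np.float32)

# Each stream gets its own (here: single-layer) subnetwork.
W_item = rng.standard_normal((embed_dim, hidden_dim)).astype(np.float32)
W_feat = rng.standard_normal((feat_dim, hidden_dim)).astype(np.float32)

h_item = np.tanh(item_embed @ W_item)   # subnetwork over item IDs
h_feat = np.tanh(extra_feats @ W_feat)  # subnetwork over features

# Late fusion: combine the subnetwork outputs, not the raw inputs.
alpha = 0.5  # fusion weight; the paper also studies trained/alternating schemes
h_fused = alpha * h_item + (1.0 - alpha) * h_feat

print(h_fused.shape)  # (4, 16)
```

The key contrast with the concatenation approach in this issue is that fusion happens after each stream has its own hidden representation, so one slow or high-dimensional feature stream does not change the input layer of the other.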
Another relevant question: I have now figured out that I can use the "init_item_embeddings" option to load the item features/vectors. But is there any way to use the user features/vectors in the package? Thanks!
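For the "init_item_embeddings" route mentioned above, here is a hedged sketch of building an item-by-feature matrix whose rows are aligned to the model's item ID ordering. The dictionary, the ID list, and the assumption that the option accepts a plain float32 matrix are illustrative; check the repository's actual data format before using this.

```python
import numpy as np

# Hypothetical item -> feature-vector mapping (e.g. from content metadata).
item_features = {
    "itemA": [0.1, 0.2, 0.3],
    "itemB": [0.4, 0.5, 0.6],
    "itemC": [0.7, 0.8, 0.9],
}

# Row order must match the model's internal item index order.
item_ids = ["itemA", "itemB", "itemC"]

init_matrix = np.asarray(
    [item_features[i] for i in item_ids], dtype=np.float32
)

print(init_matrix.shape)  # (3, 3)
```

Keeping the matrix in float32 also matters for GPU training, since a float64 initialization can propagate through the whole graph.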
Hi,
Your paper is really interesting and thank you for providing your code.
I am currently trying to add additional features to the RNN input. I don't get any errors, but GPU utilisation drops massively and the computations become very slow. All I do is concatenate a batch_size × n_features matrix (a tensor.matrix) to the item embedding (SE_item) and feed the result as input.
My questions are:
Have you tried something similar?
Am I missing something in the concatenation? Do I need to adapt "Sin" as well?
Code snippet of concatenation:
SE_item = self.E_item[X]  # sampled item embeddings
input_vec = T.concatenate([SE_item, X_additional], axis=1)  # X_additional -> tensor.matrix
vec = T.dot(input_vec, self.Ws_in[0]) + self.Bs_h[0]
Sin = SE_item
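One thing worth checking for the slowdown described above (an assumption, since the dtypes aren't shown): if X_additional is float64 while the embeddings are float32, the concatenated tensor is upcast to float64, which old-style Theano GPU backends handled very slowly or pushed back to the CPU. NumPy follows the same promotion rule, so it can illustrate the effect:

```python
import numpy as np

se_item = np.zeros((2, 4), dtype=np.float32)    # like SE_item (float32)
x_add_bad = np.zeros((2, 3), dtype=np.float64)  # accidentally float64
x_add_ok = x_add_bad.astype(np.float32)         # cast before building the graph

# Mixing dtypes upcasts the whole result to float64.
print(np.concatenate([se_item, x_add_bad], axis=1).dtype)  # float64
# With both inputs in float32, the result stays float32.
print(np.concatenate([se_item, x_add_ok], axis=1).dtype)   # float32
```

In the Theano graph itself, the equivalent fix would be casting the feature matrix with `T.cast(X_additional, theano.config.floatX)` (or loading the underlying data as float32) before the concatenation.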
Thank you so much for your help