
Concatenating Additional Features #12

Open
danielressi opened this issue Feb 1, 2019 · 6 comments


danielressi commented Feb 1, 2019

Hi,

Your paper is really interesting, and thank you for providing your code.
I am currently trying to add additional features to the RNN input. I don't get any errors, but GPU utilisation drops massively and the computations become very slow. All I do is concatenate a (batch_size × n_features) tensor.matrix to the item embedding (SE_item) and adjust the input accordingly.

My questions are:
Have you tried something similar?
Am I missing something in the concatenation? Do I need to adapt "Sin" as well?

Code snippet of the concatenation:
SE_item = self.E_item[X]  # sampled item embedding
input_vec = T.concatenate([SE_item, X_additional], axis=1)  # X_additional -> tensor.matrix
vec = T.dot(input_vec, self.Ws_in[0]) + self.Bs_h[0]
Sin = SE_item

Thank you so much for your help
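For context, the shape change that this concatenation introduces can be sketched in plain NumPy (hypothetical dimensions; the real code is Theano):

```python
import numpy as np

batch_size, d_embed, n_feat, d_hid = 4, 16, 8, 32
SE_item = np.zeros((batch_size, d_embed))      # sampled item embeddings
X_additional = np.zeros((batch_size, n_feat))  # extra per-step features

input_vec = np.concatenate([SE_item, X_additional], axis=1)
# The input dimension grows from d_embed to d_embed + n_feat, so the
# input weight matrix (Ws_in[0] above) must be resized to match:
Ws_in_0 = np.zeros((d_embed + n_feat, d_hid))
vec = input_vec @ Ws_in_0
print(input_vec.shape, vec.shape)  # (4, 24) (4, 32)
```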

mquad (Owner) commented Feb 6, 2019

Hi @danielressi ,

We tried something similar in the past in our RecSys 2016 paper on Parallel Recurrent Neural Networks.
Simple input concatenation may not be the best option, both in terms of speed and recommendation performance. A better option is to have "feature-specific" RNNs and use late feature fusion. The paper also shows that alternating training can be beneficial.

Hope it helps
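To illustrate the late-fusion idea, here is a minimal NumPy sketch with two feature-specific subnetworks (hypothetical dimensions and plain tanh-RNN cells; the actual paper uses GRUs and a trained fusion scheme):

```python
import numpy as np

rng = np.random.default_rng(0)
batch, d_item, d_feat, d_hid, n_items = 4, 16, 8, 32, 100

# Two parallel, feature-specific subnetworks, each with its own parameters
W_item, U_item = rng.normal(size=(d_item, d_hid)), rng.normal(size=(d_hid, d_hid))
W_feat, U_feat = rng.normal(size=(d_feat, d_hid)), rng.normal(size=(d_hid, d_hid))
W_out = rng.normal(size=(2 * d_hid, n_items))  # fusion happens only at the output

def step(x_item, x_feat, h_item, h_feat):
    # Each subnetwork updates its own hidden state from its own input stream
    h_item = np.tanh(x_item @ W_item + h_item @ U_item)
    h_feat = np.tanh(x_feat @ W_feat + h_feat @ U_feat)
    # Late fusion: concatenate the hidden states at the output layer only
    scores = np.concatenate([h_item, h_feat], axis=1) @ W_out
    return h_item, h_feat, scores

h_i, h_f = np.zeros((batch, d_hid)), np.zeros((batch, d_hid))
h_i, h_f, scores = step(rng.normal(size=(batch, d_item)),
                        rng.normal(size=(batch, d_feat)), h_i, h_f)
print(scores.shape)  # (4, 100)
```

Unlike early concatenation at the input, each RNN keeps its own (smaller) input weight matrix, and the two streams only interact in the final scoring layer.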

danielressi (Author) commented

@mquad thank you so much for your reply. I will have a look at the paper and try out late fusion.

mmaher22 commented

@mquad Is there a repository with the "Parallel Recurrent Neural Networks" implementation?

mquad (Owner) commented Oct 24, 2019

I'm sorry, but there's no public implementation of that paper AFAIK.

qingpeng commented

Another relevant question:
What are the "item_embedding" and "init_item_embeddings" options used for? Can we use these options to input the additional features about the items?

Thanks!

item_embedding: int
        size of the item embedding vector (default: None)
init_item_embeddings: 2D array or dict
        array with the initial values of the embeddings vector of every item,
        or dict that maps each item id to its embedding vector (default: None)
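For what it's worth, the two formats the docstring describes can be sketched like this (hypothetical item ids and sizes; the surrounding model constructor call is omitted):

```python
import numpy as np

n_items, embed_dim = 5, 4
rng = np.random.default_rng(0)

# Option A: 2D array where row i is the initial embedding of item index i
init_array = rng.normal(size=(n_items, embed_dim))

# Option B: dict mapping each item id to its initial embedding vector
item_ids = [10, 20, 30, 40, 50]  # hypothetical ids
init_dict = {item_id: rng.normal(size=embed_dim) for item_id in item_ids}

# Either object would then be passed as init_item_embeddings=...,
# together with item_embedding=embed_dim.
print(init_array.shape, init_dict[10].shape)  # (5, 4) (4,)
```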

qingpeng commented

I have now figured out that I can use the "init_item_embeddings" option to load the item features/vectors. But is there any way to use user features/vectors in the package? Thanks!
