Hi, thanks for your contribution. I have a question about the implementation. For molpcba, we first use Graphormer to pretrain on PCQM4Mv2, where num_class is 1, so the output head has size torch.Size([1, 1024]). However, when we fine-tune on molpcba, num_class is 128 and the head size is torch.Size([128, 1024]). The two sizes therefore mismatch when loading the pretrained checkpoint. How do we address this? Thanks.
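
One common way to handle this (a minimal sketch, not the official Graphormer recipe): load the pretrained state dict, drop every tensor whose shape no longer matches the fine-tuning model (i.e. the [1, 1024] PCQM4Mv2 head), and load the rest with strict=False so the new 128-way head starts from a fresh initialization. Here `model` is assumed to be the already-built fine-tuning model with num_class=128, and the checkpoint path is a placeholder:

```python
import torch

# Placeholder path; substitute your actual pretrained checkpoint.
ckpt = torch.load("pretrained_on_pcqm4mv2.pt", map_location="cpu")
# Fairseq-style checkpoints nest weights under a "model" key; fall back
# to the raw dict if yours is saved flat (this layout is an assumption).
ckpt_state = ckpt.get("model", ckpt)

model_state = model.state_dict()
# Keep only pretrained tensors whose shapes still match the fine-tuning
# model; this drops the [1, 1024] regression head so it cannot collide
# with the new [128, 1024] classification head.
compatible = {
    k: v for k, v in ckpt_state.items()
    if k in model_state and v.shape == model_state[k].shape
}
skipped = [k for k in ckpt_state if k not in compatible]
print("Skipped (left at fresh initialization):", skipped)

# strict=False lets the mismatched head keep its random initialization
# while every shape-compatible backbone weight is restored.
model.load_state_dict(compatible, strict=False)
```

The backbone then starts fine-tuning from the pretrained weights while only the task head is trained from scratch, which is the usual behavior when a downstream task has a different number of classes.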