def find_lengths(messages: torch.Tensor) -> torch.Tensor:
    """
    :param messages: A tensor of term ids, encoded as Long values, of size (batch size, max sequence length).
    :returns A tensor with lengths of the sequences, including the end-of-sequence symbol <eos> (in EGG, it is 0).
        If no <eos> is found, the full length is returned (i.e. messages.size(1)).
    """
This leads to counterintuitive behaviour in which, if max_len is 3, [1, 2, 3] and [1, 2, 0] have the same length.
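For concreteness, here is a minimal sketch of that behaviour in plain PyTorch (this is an illustrative re-implementation, not necessarily EGG's exact find_lengths code): with max_len = 3, both [1, 2, 3] (no <eos>) and [1, 2, 0] (<eos> in the last slot) come out as length 3.

import torch

def find_lengths(messages: torch.Tensor) -> torch.Tensor:
    max_len = messages.size(1)
    # True at every position from the first 0 (<eos>) onwards, per row
    after_eos = (messages == 0).cumsum(dim=1) > 0
    # length of the prefix before the first 0, plus 1 to include <eos> itself,
    # capped at max_len for rows that contain no 0 at all
    lengths = max_len - after_eos.sum(dim=1)
    return (lengths + 1).clamp(max=max_len)

print(find_lengths(torch.tensor([[1, 2, 3], [1, 2, 0]])))  # tensor([3, 3])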
Is there any update on this issue? I am also working on variable-length communication, and appending an EOS token to each message (see the three code lines below) causes find_lengths to return opts.max_len + 1 whenever the sender itself produces no EOS token. This is counterintuitive: one specifies max_len as a hard limit, yet find_lengths returns a length longer than the specified value.
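A hedged illustration of that point (the commenter's actual three lines are not reproduced in this thread; the names below, such as eos_column, are placeholders, and find_lengths can be either the sketch above or EGG's own function): appending a zero (<eos>) column to every message makes the row whose sender never emitted <eos> come out at max_len + 1.

import torch

max_len = 3
messages = torch.tensor([[1, 2, 3],    # sender emitted no <eos> within max_len steps
                         [1, 2, 0]])   # sender emitted <eos> at the last step
# force-terminate every message by appending a zero (<eos>) column
eos_column = torch.zeros(messages.size(0), 1, dtype=torch.long)
messages = torch.cat([messages, eos_column], dim=1)

print(find_lengths(messages))  # tensor([4, 3]): the first row now exceeds max_len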