distributed-trainer #1
Esmail-ibraheem announced in Announcements
Add a distributed trainer to train the transformer on multiple GPUs.
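One common way to implement such a trainer is PyTorch's `DistributedDataParallel` (DDP), with one process per GPU and a `DistributedSampler` sharding the data. The sketch below is a hypothetical minimal example of that pattern, not the repository's actual API: the linear model stands in for the transformer, and the `train` function name and its parameters are placeholders.

```python
# Hedged sketch of a DDP trainer: one process per GPU (NCCL),
# falling back to CPU with the gloo backend when CUDA is absent.
# Model, dataset, and function names are illustrative placeholders.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset, DistributedSampler

def train(rank: int, world_size: int, epochs: int = 1) -> float:
    backend = "nccl" if torch.cuda.is_available() else "gloo"
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group(backend, rank=rank, world_size=world_size)

    device = torch.device(f"cuda:{rank}" if torch.cuda.is_available() else "cpu")
    model = torch.nn.Linear(16, 2).to(device)  # stand-in for the transformer
    model = DDP(model, device_ids=[rank] if device.type == "cuda" else None)

    # Each rank sees a disjoint shard of the (toy) dataset.
    data = TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))
    sampler = DistributedSampler(data, num_replicas=world_size, rank=rank)
    loader = DataLoader(data, batch_size=8, sampler=sampler)

    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = torch.nn.CrossEntropyLoss()
    last_loss = 0.0
    for epoch in range(epochs):
        sampler.set_epoch(epoch)  # keeps shuffling consistent across ranks
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x.to(device)), y.to(device))
            loss.backward()  # DDP all-reduces gradients across ranks here
            opt.step()
            last_loss = loss.item()
    dist.destroy_process_group()
    return last_loss
```

In practice this would be launched with `torchrun` (or `torch.multiprocessing.spawn`), one process per GPU, with `rank` and `world_size` taken from the launcher's environment.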

This discussion was created from the release distributed-trainer.