Information about final evaluation/submission of your policy #22
luator announced in Announcements

Replies (2 comments, 2 replies):
- May I know when the final score will be announced? Regards. (1 reply)
- When will the website to submit the report be announced? Regards. (1 reply)

Announcement (luator):

This is a reminder that the real robot phase of the challenge ends this Friday,
7th Oct. at 14:00 UTC.
To avoid any surprises with the final evaluation, here is some information on
how we will get your submissions and run the evaluation:
At the time of the deadline (or shortly after), we will copy the files you
uploaded to the submission system and clone your code from the
repository/branch which you configured in roboch.json. This means you don't
need to actively submit it anywhere, but please make sure that everything is
configured correctly in the end, so that we get the correct version of your
code/policy. It may be a good idea to use a dedicated branch or tag for your
final version and configure that in roboch.json before the deadline, to avoid
accidentally making any breaking changes before we clone it.
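For reference, a roboch.json pointing at such a dedicated branch/tag might look
roughly like the sketch below. This is only an illustration: the field names and
values here are assumptions, so please follow the schema described in the
challenge documentation rather than this sketch.

```json
{
    "repository": "git@github.com:your-team/your-submission-repo.git",
    "branch": "final-submission"
}
```

Here "final-submission" is a hypothetical tag/branch created just for the final
version (e.g. with git tag final-submission followed by
git push origin final-submission).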
For the evaluation, we are using your policies as specified in the
trifinger.toml of your repository and simply overwrite the task/dataset_type
fields accordingly for the evaluation runs. Therefore, please make sure that
you have specified proper policies for all four cases and that they all work
by just selecting the corresponding task/dataset_type, without the need for any
further changes in your code.
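As a purely illustrative sketch (assuming the four cases are the combinations of
the two tasks, push and lift, with the two dataset types, expert and mixed), a
trifinger.toml could contain one policy entry per combination along these lines.
The key names and entry-point format below are assumptions, not the official
schema, so please check the example package and documentation for the actual
layout:

```toml
# Hypothetical layout -- key names and entry-point strings are assumptions.
[policy]
push_expert = "my_package.policies:PushExpertPolicy"
push_mixed  = "my_package.policies:PushMixedPolicy"
lift_expert = "my_package.policies:LiftExpertPolicy"
lift_mixed  = "my_package.policies:LiftMixedPolicy"
```

The point is simply that selecting a task/dataset_type combination should pick
up a working policy without any further code changes.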
Note that if your results in the last evaluation round are as expected (see
leaderboard), then you are most likely already set up correctly.
Please note that you are also asked to provide your full training code used to
train the submitted policy. You can simply add it to the repository
used for the submission system. If this is not possible for some reason, you
can also send it by email to [email protected] (please mention the username
of your team in this case).
Apart from the code itself, there should also be instructions on how to install
and run it (e.g. in the README file). Ideally we should be able to run the
training ourselves with the given code and instructions.
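For example, the README could list the exact commands needed to set up and
reproduce the training, along the lines of the sketch below. The repository URL,
script name, and flags are hypothetical placeholders:

```sh
# Illustrative only -- repository URL, script name, and flags are assumptions.
git clone https://github.com/your-team/your-submission-repo.git
cd your-submission-repo
pip install -e .
python scripts/train.py --task push --dataset-type expert --output-dir models/
```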