Attention
This challenge has ended!
This documentation covers only the Real Robot Challenge 2020, which has ended. Subsequent challenges have their own documentation; see the challenge website for more information.
How to Locally Run Your Code in Simulation¶
You can run your code locally in simulation using the same environment as on the real robot. This way you can verify that everything is working before making an actual submission to the robot.
Requirements¶
A computer running Linux with Python 3 and Singularity (we tested with Python 3.6 and Singularity 3.6; other recent versions should also work).
Your code needs to be provided in a git repository following the structure described in Structure of the User Code Package.
The Singularity image used by the submission system. See Download the Real Robot Challenge Image.
The script run_in_simulation.py from the rrc_example_package.
Execute Code¶
To execute your code, use the script run_in_simulation.py. You need to pass as arguments the path to the output directory where the results will be stored, the git repository, and the Singularity image that is used for execution.
Example:
./run_in_simulation.py --output-dir ~/output \
--repository git@github.com:myuser/myrepo.git \
--backend-image path/to/realrobotchallenge.sif
For a list of all options use --help.
See Complete List of Generated Files for a description of the files that are written to the specified --output-dir.
For the repository, you can also specify the absolute path to a local repository; then you do not need to push every change to the server before testing (you still need to commit, though!). If you are running the script from within the repository directory, you can simply pass --repository $(pwd).
You may specify a git branch using --branch. If not set, the default branch of the repository is used.
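Putting these options together, a quick iteration loop during development might look like the following sketch (the commit message, branch name, and image path are placeholders for your own setup):

```shell
# Commit your changes, then run the committed state of the current
# working copy on a specific branch (placeholder names/paths).
git commit -am "Try new policy parameters"
./run_in_simulation.py --output-dir ~/output \
    --repository $(pwd) \
    --branch my-feature-branch \
    --backend-image path/to/realrobotchallenge.sif
```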
If you are using a modified Singularity image for your code, you need to specify it with --user-image. Note that for --backend-image you should always use the unmodified standard image provided by us, to ensure that you have the same conditions as on our side.
Visualization¶
You can enable visualization using the --visualize flag. There are a few things to consider, though:
You will need to export the DISPLAY environment variable into the Singularity container. To do this, execute the following before running run_in_simulation.py (you can put it in your .bashrc so you don't need to remember it every time):
export SINGULARITYENV_DISPLAY=$DISPLAY
If you are running on a machine that uses Nvidia drivers, it may be necessary to also pass the --nv flag. See the Singularity documentation on GPU Support.
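As a quick way to check that display forwarding works, you can invoke Singularity directly with both settings; this is only a sketch, and glxgears is just an illustrative command that may not be installed in the image:

```shell
# Forward the host display into the container ...
export SINGULARITYENV_DISPLAY=$DISPLAY
# ... and enable Nvidia GPU support with --nv (only needed on
# machines with Nvidia drivers); the image path is a placeholder.
singularity exec --nv path/to/realrobotchallenge.sif glxgears
```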
Limitations¶
There are some limitations to the simulation which you need to keep in mind when using it:
In this setup the simulation unfortunately runs rather slowly, so depending on your hardware, the simulated robot may not run at 1 kHz but somewhat slower. The camera/object observations are synchronised accordingly.
No camera images are rendered! Rendering the camera images is too slow and cannot happen in parallel, so it would break the timing of the whole setup. The cameras are therefore disabled in this simulation. The camera observations are still provided, as they also contain the object position, but the images in them are not initialized.