The goal of this project is to link a haptic device (the WEART TouchDIVER), a VR headset (the Oculus Rift S) and a physics simulation software (CoppeliaSim with the MuJoCo engine, or directly MuJoCo) to simulate real-time touch of a deformable object.
This project is conducted at the DIAG Robotics Laboratory of Sapienza University of Rome, under the supervision of Marilena Vendittelli. It is the subject of one of my internships at INSA Rennes, a French engineering school.
The simulator integrates code to retarget the motion of the fingers (using the closure and abduction values from WEART) to the virtual hand in MuJoCo. This has been done by a group of Sapienza students.
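To illustrate the idea (this is only a sketch, not the students' actual code), the closure and abduction values reported by WEART are normalized to [0, 1] and can be mapped linearly onto the joint ranges of the virtual hand. The joint limits below are hypothetical placeholders:

```python
import numpy as np

# Hypothetical joint limits in radians; the real values come from the hand model.
FLEXION_RANGE = (0.0, 1.6)       # full closure of a finger
ABDUCTION_RANGE = (-0.35, 0.35)  # spread of the thumb

def retarget(closure: float, abduction: float) -> tuple[float, float]:
    """Map WEART closure/abduction values (both in [0, 1]) to joint angles."""
    flexion = float(np.interp(closure, [0.0, 1.0], FLEXION_RANGE))
    spread = float(np.interp(abduction, [0.0, 1.0], ABDUCTION_RANGE))
    return flexion, spread
```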
During the project, a way to display MuJoCo in a VR headset has been developed.
You will find the attempts in `mjxr_tests`. The final, working one is `mujoco_openxr.py` and is also available on GitHub Gists.
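For orientation, the core of that approach follows the standard pyopenxr session and frame loop. The skeleton below is only a sketch of the mechanism, not the full `mujoco_openxr.py`:

```python
import xr  # pyopenxr

# OpenXR render loop with the OpenGL extension enabled; mujoco_openxr.py
# additionally creates a MuJoCo renderer and draws the scene into the
# framebuffer bound for each eye.
with xr.ContextObject(
    instance_create_info=xr.InstanceCreateInfo(
        enabled_extension_names=[xr.KHR_OPENGL_ENABLE_EXTENSION_NAME],
    ),
) as context:
    for frame_index, frame_state in enumerate(context.frame_loop()):
        for view in context.view_loop(frame_state):
            # "view" holds the pose and field of view of one eye;
            # this is where the MuJoCo scene gets rendered.
            pass
```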
To better understand how everything works, see the paper I wrote.
I made a tutorial on how to convert a 3D model to a deformable material in MuJoCo. It is available here.
Before installing dependencies, remember to create a Python virtual environment!
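For example, on Linux or macOS (on Windows, activate with `.venv\Scripts\activate` instead):

```
python -m venv .venv
source .venv/bin/activate
```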
- WEART Python SDK, for the TouchDIVER: `pip install weartsdk-sky`
- CoppeliaSim ZMQ API, for the simulation (see the manual): `pip install coppeliasim_zmqremoteapi_client`
- MuJoCo, for the simulation (see the manual): `pip install mujoco`
- pyopenxr, for the VR headset and the controllers: `pip install pyopenxr`
- pynput, to listen to keyboard presses: `pip install pynput`
- colorama, for pretty-printing: `pip install colorama`
- matplotlib, for real-time performance plots: `pip install matplotlib`
- numpy, for finger motion retargeting: `pip install numpy`
You can install all of these dependencies at once by running the following command from the repository directory:

```
pip install -r requirements.txt
```
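For reference, a `requirements.txt` covering the packages listed above would look roughly like this (the file shipped in the repository is authoritative):

```
weartsdk-sky
coppeliasim_zmqremoteapi_client
mujoco
pyopenxr
pynput
colorama
matplotlib
numpy
```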
- Open the scene in CoppeliaSim.
- Open the WEART Middleware and connect your TouchDIVER.
- Change the options in `simulator.py` so they match your setup.
- Launch the `simulator.py` Python file.
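If you want to check the connection to CoppeliaSim independently of `simulator.py`, a minimal ZMQ remote API session looks like this (assuming CoppeliaSim runs locally on its default remote API port, and a recent version of the client that exposes `require`):

```python
from coppeliasim_zmqremoteapi_client import RemoteAPIClient

client = RemoteAPIClient()    # connects to localhost:23000 by default
sim = client.require('sim')   # handle to CoppeliaSim's 'sim' namespace
sim.startSimulation()
print(sim.getSimulationTime())
sim.stopSimulation()
```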
- Open the WEART Middleware and connect your TouchDIVERs.
- Wear your TouchDIVERs in the same way as in the following picture:
> [!IMPORTANT]
> Make sure to wear the Oculus Touch Controllers in the same position as in the picture (facing the exterior of the hands).
- Connect your VR device and launch the runtime program (Meta Quest Link for instance).
- Change the options in `simulator.py` so they match your setup.
- Launch the `simulator.py` Python file.
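As noted in the dependency list, the simulator uses pynput to react to key presses while it runs. A minimal sketch of that mechanism (the actual key bindings are defined in `simulator.py`):

```python
from pynput import keyboard

def on_press(key):
    # In the simulator, this is where a key press would trigger an action;
    # returning False stops the listener (here, on Escape).
    if key == keyboard.Key.esc:
        return False

listener = keyboard.Listener(on_press=on_press)
listener.start()
```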
The project is currently written in pure Python code and depends on cross-platform libraries. It is therefore also cross-platform.
However, to use a TouchDIVER, the WEART Middleware must be running, and that software is Windows-only. Fortunately, there is a workaround, described in the following section.
If you have a secondary computer running Windows, it is possible to run the simulation on a primary computer (running Linux, for example):
- Launch CoppeliaSim on the primary computer
- Launch WEART Middleware on the Windows computer
- Link the primary and the secondary computers and allow the primary one to reach the Middleware's TCP port 13031, which is bound to the loopback interface on the Windows machine:
  - either by putting them on the same network and using a tool like socat to redirect the TCP port (see the example after this list),
  - or by using ZeroTierOne as an easy solution (untested, and you will still need to set up the TCP redirection),
  - or by setting up tunnels if there is no way to do the above.
- This depends on your network configuration.
- If both computers are "hidden" from each other, for instance both behind NATs, you can use a third publicly accessible machine (for instance a cloud VPS) as a relay. Set up a reverse SSH tunnel between the Windows machine and the relay that exposes port 13031 on a public port, and make the Linux client connect to that port on the relay:
  ```
  ssh -N -4 -R <relay-port>:localhost:13031 <user>@<relay address>
  ```
  (the `-4` switch is necessary on Windows, see this issue)
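For the socat option mentioned above, one possible setup (a sketch, assuming the Middleware listens on 127.0.0.1:13031 on the Windows machine and that port 13032 is free) is:

```
# on the Windows computer: expose the loopback-only Middleware port to the LAN
socat TCP-LISTEN:13032,fork,reuseaddr TCP:127.0.0.1:13031

# on the primary computer: make localhost:13031 point at the Windows machine
socat TCP-LISTEN:13031,bind=127.0.0.1,fork TCP:<windows-ip>:13032
```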
Use-case: CoppeliaSim runs much faster on Linux.
You can plug in any simulation scene you want. The only requirements are:
- the hands' mocap bodies must follow the names `{side}_hand_mocap`, where `{side}` is `left` or `right`;
- the hands' "real" bodies must follow the names `{side}_hand`, and their rotations must be expressed with the `euler` parameter;
- the fingertip sensors must be of type `contact` and follow the names `{side}_fingertip_{finger}`, where `{finger}` is `thumb`, `index` or `middle`.
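A quick way to check that a custom scene follows these conventions is to load it with the MuJoCo Python bindings and look the names up (a sketch; `scene.xml` is a placeholder path):

```python
import mujoco

model = mujoco.MjModel.from_xml_path("scene.xml")  # placeholder path

for side in ("left", "right"):
    # named lookups raise KeyError if a body or sensor is missing
    assert model.body(f"{side}_hand_mocap").mocapid[0] >= 0, \
        f"{side}_hand_mocap must be a mocap body"
    model.body(f"{side}_hand")
    for finger in ("thumb", "index", "middle"):
        model.sensor(f"{side}_fingertip_{finger}")
```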