The in-hand-object-tracking project is a suite of applications for in-hand object tracking on the humanoid robot platform iCub.
The suite includes:
- object-tracking: a visual-tactile in-hand object tracker combining partial point clouds and contact points within a 3D model-aided UPF (Unscented Particle Filter)
- object-tracking-viewer: a visualizer that shows the object estimate, the ground truth and the point cloud of the scene
- object-tracking-ground-truth: a marker-based ground-truth module for validation
Run a tracking experiment directly in your browser using GitPod.
See the section Use the suite in a GitPod environment for instructions.
- Dependencies
- Use the suite in a GitPod environment
- Build the suite
- Run an experiment
The in-hand-object-tracking suite depends on:
- BayesFilters (version >= 0.9.100)
- iCub
- iCubContrib
- nanoflann
- OpenCV (version >= 3.3)
- OpenMP (optional)
- Open Asset Import Library (ASSIMP, version >= 3.0)
- SuperimposeMesh (version >= 0.10.100)
- VTK (optional)
- YARP
Important: please use the devel branch for the libraries BayesFilters and SuperimposeMesh.
Tip: if you don't have time to install all the dependencies, you can use a GitPod environment directly in your browser; only a few steps are required! See the next section for instructions.
In order to use the suite in a GitPod environment, please follow these instructions:
After log-in, the GitPod system will prepare an environment with all the dependencies required by the suite - this may take some time. Once done, you will be presented with a screen like the following.
In order to access the GUI of the environment:
- Click on the ports button in the bottom-right corner of the screen
- Search for the port 6080 and click on the button Open Browser
- Click on the button Connect
A Linux desktop environment will then be ready for you with all the required dependencies already installed.
You can build the suite using the instructions from the next section.
The suite is built using CMake. Use the following commands to build and install it:
$ git clone https://github.com/robotology/visual-tactile-localization
$ cd visual-tactile-localization
$ mkdir build && cd build
$ cmake [-DUSE_OPENMP=ON] ..
$ make
$ [sudo] make install
The option -DUSE_OPENMP=ON is optional. If set to ON, the code is built using the OpenMP library for multithreaded execution.
The module object-tracking-viewer requires the library VTK. In order to build the object-tracking-viewer module, the following option is required when cmake is run:
$ cmake -DBUILD_OBJECT_TRACKING_VIEWER=ON ..
The module object-tracking-ground-truth requires the library OpenCV to be built with the extra modules and the ArUco module activated (i.e. with the cmake option BUILD_opencv_aruco set to ON).
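For reference, such a build of OpenCV can be sketched as follows. This is only a hedged outline: the version tag and the relative paths are assumptions, while OPENCV_EXTRA_MODULES_PATH and BUILD_opencv_aruco are the standard OpenCV build options.

```shell
# Clone OpenCV and the contrib (extra) modules at matching versions (tag assumed)
git clone -b 3.4.0 https://github.com/opencv/opencv
git clone -b 3.4.0 https://github.com/opencv/opencv_contrib

# Configure with the extra modules path and the ArUco module enabled
cd opencv && mkdir build && cd build
cmake -DOPENCV_EXTRA_MODULES_PATH=../../opencv_contrib/modules \
      -DBUILD_opencv_aruco=ON ..
make
sudo make install
```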
In order to build the object-tracking-ground-truth module, the following option is required:
$ cmake -DBUILD_OBJECT_TRACKING_GROUND_TRUTH=ON ..
The tracking algorithm can be tested offline using the dataset provided in the following section. Please follow these instructions.
If you are using GitPod, you do not need to download the dataset since it is already available on the Desktop of the environment.
Download the example dataset and unzip it. In the following, the extracted folder will be identified as $DATASET.
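As a minimal sketch of this step (the archive name and destination folder are assumptions, since the text does not fix them):

```shell
# Unzip the downloaded archive (file name assumed) into a folder of choice
unzip dataset.zip -d "$HOME/object-tracking-dataset"

# Identify the extracted folder as $DATASET for the rest of the walkthrough
export DATASET="$HOME/object-tracking-dataset"
```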
1. Open a new terminal and run the YARP server:
   $ yarpserver --write
2. In another terminal, run the YARP manager:
   $ yarpmanager
   The YARP manager window will open.
3. Double click on the Applications entity in order to show the list of available applications.
4. Double click on the application named Object_Tracking_on_iCub_(Play). A new tab will open on the right.
5. From the list of modules available within the application Object_Tracking_on_iCub_(Play), select the module named yarpdataplayer by clicking on it and open it by clicking on the green button as indicated in the following figure.
6. The YARP dataplayer will open. This module is required to play back the data stored within the folder $DATASET.
7. Open the dataset by clicking on File, then on Open Directory, in the menu at the top of the window.
8. A browse dialog will open. Select the folder $DATASET and click on the button Choose. If the data is loaded correctly, the dataplayer window should look like the following. Now the data is ready to be played back!
9. Go back to the YARP manager window and open all the remaining modules by clicking on the green button as indicated in the following figure.
10. In a few seconds, you should see a green tick to the left of each module name as in the following figure. Several windows will appear:
    - two instances of a YARP image viewer: one shows the images from the left eye of the robot iCub; the other shows the same images with the current bounding box of the object and the convex hull enclosing the robot hand superimposed. Please note that the viewers will initially be black since the experiment is not running yet.
    - an instance of a 3D viewer showing a 3D reconstruction of the scene, comprising:
      - the point cloud of the scene
      - the estimate of the object (in gray)
      - the ground truth (in transparent green)
      - the current pose of the hand of the robot
    Please note that the 3D viewer will initially be uninitialized, as shown in the following figure, since the experiment is not running yet.
    Now all the required modules are running!
11. Before starting the experiment, connect all the modules by clicking on the green button as indicated in the following figure.
12. Look at the bottom of the YARP manager window. All the connections should be reported in green as connected under the column Status. If any connection is displayed in red as disconnected, please click the green button again as per step 11 until all the connections are green.
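If a connection refuses to turn green, the YARP network can also be inspected from a terminal. The sketch below uses the standard yarp companion commands; the only port name taken from this document is /object-tracking/cmd:i, so treat the check as illustrative:

```shell
# List all ports currently registered with the YARP name server
yarp name list

# Check that the tracker's RPC port is reachable on the network
yarp exists /object-tracking/cmd:i && echo "tracker port is up"
```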
In order to run the filtering algorithm, open a terminal and connect to the filtering module:
$ yarp rpc /object-tracking/cmd:i
Then start the filtering recursion by typing:
>> run_filter
In case of success, the following response is displayed:
Response: [ok]
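For scripted runs, the same command can also be sent non-interactively by piping it into yarp rpc; this is a sketch, not part of the original walkthrough:

```shell
# Send run_filter to the tracker's RPC port and print the reply
echo "run_filter" | yarp rpc /object-tracking/cmd:i
```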
To start the experiment, press the Play button on the yarpdataplayer window as shown in the following figure.
Please note that if you are running the experiment on GitPod, you may experience some performance degradation due to the limited computing capabilities of the environment.
Once the experiment is started, move to the 3D viewer window and press the key R to reset the view. To zoom, use the mouse scroll wheel. To move the point of view, press and hold the left mouse button and move the mouse until the desired view is obtained. An example 3D view is shown in the following figure.
The YARP viewer window shows the current bounding box enclosing the object, in green, and the convex hull enclosing the robot hand, in red.
In order to restart the experiment, first reset the filtering module by typing:
>> reset_filter
Then, press the Play button on the yarpdataplayer window again.