Simitate: A Hybrid Imitation Learning Benchmark

Simitate is a hybrid imitation learning benchmark consisting of a dataset, metrics, and an evaluation approach.

[Figure: Simitate overview]

Video

Watch the video

If the video does not play, you can download it here.

Citation

@inproceedings{memmesheimer2019simitate,
  title={Simitate: A Hybrid Imitation Learning Benchmark},
  author={Memmesheimer, Raphael and Kramer, Ivanna and Seib, Viktor and Paulus, Dietrich},
  booktitle={2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={5243--5249},
  year={2019},
  organization={IEEE}
}

Dataset

The root of the data folder can be found here: All Files

The dataset (ca. 423 GB) is available here: Simitate Data. We provide ROS bag files and JPEG sequences of the RGB and depth cameras separately. Timestamps are encoded in the filenames. The accompanying CSV files, one per sequence, contain ground-truth poses for the demonstrator's hand and the objects of interest.
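
As a quick-start sketch, the recording timestamp can be parsed from a sequence filename and the per-sequence ground-truth CSV inspected with pandas. The filename below is taken from the examples in this repository; the CSV column layout is printed rather than assumed:

from datetime import datetime
import os

import pandas as pd

sequence = "examples/heart_2018-08-23-18-02-06.csv"

# Sequence filenames follow <class>_<YYYY-MM-DD-HH-MM-SS>.<ext>
stem = os.path.splitext(os.path.basename(sequence))[0]
stamp = datetime.strptime(stem.split("_", 1)[1], "%Y-%m-%d-%H-%M-%S")
print("recorded at:", stamp)

# Inspect the ground truth (hand and object poses, see above)
ground_truth = pd.read_csv(sequence)
print(ground_truth.columns.tolist())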

To batch-download files you can use wget. For example, the following command downloads all basic-motions sequences:

wget -m --no-parent --reject index.html* https://agas.uni-koblenz.de/data/simitate/data/simitate/extracted/basic_motions

Overview

A tabular overview of the scenes, with download links for individual sequences, can be found here:

Benchmark

For benchmarking you need to execute the following steps.

For trajectory quality:

  • Pick a sequence
  • Use the provided trajectory_loader class, which helps you collect your estimates for later evaluation
  • Imitate the visually observed behaviour

import os

import trajectory_loader

tl = trajectory_loader.SimitateTrajectoryLoader()
tl.load_trajectories(trajectory_file)

# add estimates
tl.add_point_to_trajectory("estimated_position" + robot_name, <timestamp>, <pos/pose>)

# after imitation, save the file to obtain evaluation results
tl.save_trajectory_to_file(os.path.basename(trajectory_file + ".pkl"))

The trajectory loader and the evaluation scripts can be found here

Then use the eval_ate.py script as follows:

python2.7 eval_ate.py <ground_truth_file>.csv <estimated>.pkl

Afterwards you get the corresponding metric result. We provide exemplary estimates from the paper; you can find the files in <simitate>/data/eval/.
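
For intuition, the core of the absolute trajectory error (ATE) is the translational RMSE between matched ground-truth and estimated positions. The following numpy sketch illustrates this core only; the actual eval_ate.py may additionally time-associate and rigidly align the two trajectories first:

import numpy as np

def ate_rmse(gt_xyz, est_xyz):
    # gt_xyz, est_xyz: (N, 3) arrays of positions matched by timestamp
    errors = np.linalg.norm(gt_xyz - est_xyz, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))

gt = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.2, 0.0, 0.0]])
est = gt + 0.01  # a made-up estimate, 1 cm off per axis
print(ate_rmse(gt, est))  # ~0.017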

We provide integration with:

  • PyBullet (primary)
  • Gazebo (minimal)

Simitate is available as a ROS package here:

Simitate Package

PyBullet

PyBullet is the primarily supported simulator.

Tests were executed on Ubuntu 16.04 and Ubuntu 18.04 in a clean conda environment (Python 2 and Python 3 are supported):

  • conda create -n simitate python=3.8
  • conda activate simitate
  • pip install -r requirements.txt

Then run the following to show the ground truth, e.g. with the Sawyer robot:

python simitate/simitate_bullet.py -gt -config=configs/config_sawyer.yaml  examples/heart_2018-08-23-18-02-06.csv 

This command should run the simulation of a heart trajectory.

The script looks for estimated trajectories for evaluation; if none are found, the simulated robot executes the ground-truth trajectory.

Further Information:

usage: simitate_bullet.py [-h] [-gt] [-config CONFIG] csvfile

positional arguments:
  csvfile

optional arguments:
  -h, --help      show this help message and exit
  -gt             use ground truth
  -config CONFIG  Config file

The result using the TIAGo robot and the ground truth should look like this:

[Figure: TIAGo following the ground-truth trajectory]

And the estimated trajectory using the presented baseline approach:

[Figure: TIAGo following the trajectory estimated by the baseline approach]

Gazebo

Prerequisites

  • ROS Kinetic
  • Gazebo 7
  • Tested on Ubuntu 16.04

TIAGo

We ran experiments with PAL Robotics' TIAGo robot in simulation, following the installation instructions found here.

PR2

sudo apt install ros-kinetic-pr2-gazebo

Launch

To launch the Gazebo simulator using the Simitate benchmark execute the following steps:

# edit the config/spawn_from_ground_truth.yaml and add the frames of interest
roslaunch simitate gazebo.launch robot:=pr2|tiago
rosbag play <sequence_path>/<sequence_name>.bag

Contributing

This section is relevant if you want to extend the dataset with new sequences. We highly encourage new additions, for example:

  • Existing classes in new environments
  • Introducing new tasks
  • New robot models

Recording new sequences

We did our best to provide a framework for extending the dataset. Scripts for recording are contained in the repository.

Prerequisites

  • sudo apt install ros-kinetic-vrpn-client-ros ros-kinetic-vrpn

Steps

For recording you need to execute the following steps:

  • Make sure that you are connected to Ethernet
  • Make sure that the Motive host is connected to Ethernet
  • Make sure that you can reach the machine running Motive (ping 192.168.0.2)
  • Make sure that the VRPN Streaming Engine is enabled (View -> Data Streaming Pane -> VRPN Streaming Engine -> Broadcast VRPN Data)
  • roslaunch simitate record_motion.launch will connect to Motive and visualize the tracked markers.

Playback

Example:

roslaunch simitate play_motion.launch file:=basic_motions_raphael/take03/circle_2018-08-23-17-55-04.bag

Additional

Camera Calibration

We provide the intrinsic and extrinsic calibration files used during the recording of the sequences. This allows projecting the motion capture data onto the images. The camera transformations are contained in the per-sequence .csv files, as the motion capture system was recalibrated throughout the recording of the sequences. We recommend using the get_transform_to_world function from the SimitateTrajectoryLoader for reading the transformation; a projection sketch follows the intrinsic parameters below.

height: 540 
width: 960 
distortion_model: "plumb_bob" 
D: [0.08358987550663602, -0.14957906731226864, -0.003103469710389675, -0.00031033751957969485, 0.06981523248780676] 
K: [536.005076997568, 0.0, 478.48108901107867, 0.0, 537.8178473127615, 254.99770751448608, 0.0, 0.0, 1.0] 
R: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0] 
P: [536.005076997568, 0.0, 478.48108901107867, 0.0, 0.0, 537.8178473127615, 254.99770751448608, 0.0, 0.0, 0.0, 1.0, 0.0]
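
As a minimal sketch of how the intrinsics above can be used, the following projects a 3D point onto the image with the pinhole model. The point is made up and assumed to already be in the camera frame (e.g. after applying the transform read via get_transform_to_world), and the plumb_bob distortion coefficients D are ignored for brevity:

import numpy as np

# Intrinsic matrix K from the calibration above, in row-major order
K = np.array([[536.005076997568, 0.0, 478.48108901107867],
              [0.0, 537.8178473127615, 254.99770751448608],
              [0.0, 0.0, 1.0]])

point_cam = np.array([0.1, -0.05, 1.5])  # hypothetical point in metres, camera frame
u, v, w = K @ point_cam
px, py = u / w, v / w
print(px, py)  # pixel coordinates in the 960x540 image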

Extrinsic calibration files are provided here

Objects

Simitate Objects

For reproducibility, we provide a list of the objects used in the experiments. The following IKEA objects have been used for Simitate:

Name          Number      Type
365+          604.063.04  plates
JÄLL          202.428.90  ironing board
BITTERMANDEL  204.323.81  vase
STEKA         926.258.00  pan
PS 2002       303.879.72  watering can
LILLNAGGEN    402.435.96  shower squeegee
FÖRDUBBLA     903.459.41  2-piece knife set
HEAT          870.777.00  trivet
ANTAGEN       202.339.61  dish brush
BLASKA        701.703.29  dust pan and brush
GNARP         303.358.41  3-piece kitchen utensil set, black
SVAMPIG       602.576.05  sponge
FLUNDRA       401.769.59  dish drainer
FÄRGRIK       003.189.56  mug
VISPAD        602.575.25  colander
GLIS          800.985.83  box with lid
FEJKA         903.751.55  artificial potted plant
GUBBRÖRA      902.257.31  rubber spatula

Table: Object list for Simitate
