ATOM is a set of calibration tools for multi-sensor, multi-modal robotic systems.
It is based on the optimization of atomic transformations as provided by a ROS-based robot description. Moreover, ATOM provides several scripts to facilitate all the steps of a calibration procedure.
Unlike most other calibration approaches, ATOM offers tools to address the complete calibration pipeline:
1. **Prepare your environment** (optional) - define environment variables for easy cross-machine access to data:

    ```bash
    export ROS_BAGS="$HOME/bagfiles"
    export ATOM_DATASETS="$HOME/datasets"
    ```

2. **Create a calibration package** for your robotic system:

    ```bash
    rosrun atom_calibration create_calibration_pkg --name <your_robot_calibration>
    ```

3. **Configure your calibration package** - edit the file `<your_robot_calibration>/calibration/config.yml` with your system information, then run:

    ```bash
    rosrun <your_robot_calibration> configure
    ```

4. **Set initial estimate** - deploy interactive tools based on rviz that allow the user to set the pose of the sensors to be calibrated, while receiving visual feedback:

    ```bash
    roslaunch <your_robot_calibration> set_initial_estimate.launch
    ```

5. **Collect data** - extract snapshots of data (a.k.a. collections):

    ```bash
    roslaunch <your_robot_calibration> collect_data.launch output_folder:=~/datasets/<my_dataset>
    ```

6. **Calibrate sensors** - finally, run an optimization that will calibrate your sensors:

    ```bash
    roslaunch <your_robot_calibration> calibrate.launch dataset_file:=~/datasets/<my_dataset>/data_collected.json
    ```
To calibrate your robot you must first define your robotic system (e.g. `<your_robot>`). You should also have a system description in the form of URDF or xacro file(s). This is normally stored in a ROS package named `<your_robot>_description`.

Finally, ATOM requires a bagfile with a recording of the data from the sensors you wish to calibrate. Transformations in the bagfile (i.e. topics `/tf` and `/tf_static`) will be ignored, so that they do not collide with the ones being published by the `robot_state_publisher`. Thus, if your robotic system contains moving parts, the bagfile should also record `sensor_msgs/JointState` messages.
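Before recording or replaying, it can help to sanity-check the bag's contents against these requirements. The helper below is a hypothetical sketch, not part of ATOM: it takes a topic-to-message-type mapping (such as you would read off `rosbag info` output, here built by hand) and flags the situations described above.

```python
def check_bag_topics(topic_types):
    """Flag bag contents that matter for an ATOM calibration.

    topic_types: dict mapping topic name -> message type string,
    e.g. {"/joint_states": "sensor_msgs/JointState"}. This input
    format is an assumption for illustration, not an ATOM API.
    """
    warnings = []
    # /tf and /tf_static in the bag are ignored during playback,
    # since robot_state_publisher republishes transformations.
    if "/tf" in topic_types or "/tf_static" in topic_types:
        warnings.append("/tf topics present: they will be ignored during playback")
    # Moving parts require recorded joint states.
    if "sensor_msgs/JointState" not in topic_types.values():
        warnings.append("no sensor_msgs/JointState recorded: "
                        "moving parts will not be tracked")
    return warnings
```

For a bag that records `/joint_states` and no `/tf`, the function returns an empty list.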
It is also possible to record compressed images, since ATOM can decompress them while playing back the bagfile.
To start, you should create a calibration ROS package specific to your robot. ATOM provides a script for this:

```bash
rosrun atom_calibration create_calibration_pkg --name <your_robot_calibration>
```

This will create the ROS package `<your_robot_calibration>` in the current folder, but you can also specify a different path, e.g.:

```bash
rosrun atom_calibration create_calibration_pkg --name ~/my/path/<your_robot_calibration>
```
Once your calibration package is created, you will have to configure the calibration procedure by editing the `<your_robot_calibration>/calibration/config.yml` file with your system information. Here is an example of a config.yml file.
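As a rough illustration only, such a file pairs sensor definitions with a calibration pattern description. The fragment below uses made-up sensor and field names; the authoritative schema is the template generated by `create_calibration_pkg`, which may differ.

```yaml
# Illustrative sketch only: field names are assumptions; consult the
# config.yml template generated for your package for the real schema.
sensors:
  front_camera:
    link: "front_camera_optical_frame"    # sensor frame in the robot description
    topic_name: "/front_camera/image_raw" # data topic recorded in the bagfile
calibration_pattern:
  pattern_type: "charuco"                 # e.g. chessboard or charuco
  size: 0.02                              # square size in meters
```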
After filling in the config.yml file, you can run the package configuration:

```bash
rosrun <your_robot_calibration> configure
```

This will create a set of files for launching the system, configuring rviz, etc.
Iterative optimization methods are often sensitive to the initial parameter configuration. Here, the optimization parameters represent the poses of each sensor. ATOM provides an interactive framework based on rviz which allows the user to set the pose of the sensors while having immediate visual feedback.
To set an initial estimate run:

```bash
roslaunch <your_robot_calibration> set_initial_estimate.launch
```
Here are a couple of examples: Atlascar2, AgrobV2, and UR10e eye-in-hand.
To run a system calibration, one requires sensor data collected at different time instants. We refer to these as data collections. To collect data, the user should launch:

```bash
roslaunch <your_robot_calibration> collect_data.launch output_folder:=<your_dataset_folder>
```

Depending on the size and number of topics in the bag file, it may be necessary (it often is) to reduce the playback rate of the bag file:

```bash
roslaunch <your_robot_calibration> collect_data.launch output_folder:=<your_dataset_folder> bag_rate:=<playback_rate>
```
Here are some examples of the system collecting data: Atlascar2, AgrobV2, and UR10e eye-to-base.
A dataset is a folder which contains a set of collections. There, a `data_collected.json` file stores all the information required for the calibration.
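Since the dataset is plain JSON, it can be inspected with a few lines of Python. The sketch below assumes a top-level `collections` mapping keyed by collection name; that key is an assumption about the schema, not a guaranteed field name.

```python
import json

def summarize_dataset(path):
    """Load a data_collected.json file and list its collections.

    Assumes a top-level "collections" mapping, which may not match
    the actual ATOM schema exactly.
    """
    with open(path) as f:
        dataset = json.load(f)
    collections = dataset.get("collections", {})
    print(f"{len(collections)} collections: {sorted(collections)}")
    return sorted(collections)
```

For example, a dataset whose file contains `{"collections": {"0": {}, "1": {}}}` yields `["0", "1"]`.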
Finally, a system calibration is called through:

```bash
roslaunch <your_robot_calibration> calibrate.launch dataset_file:=~/datasets/<my_dataset>/data_collected.json
```
You can also define the following additional arguments:

- `single_pattern:=true` - show a single pattern instead of one per collection.
- `use_incomplete_collections:=true` - remove any collection which does not have a detection for all sensors.
- `-ssf "SENSOR_SELECTION_FUNCTION"` - a string to be evaluated as a lambda function that receives a sensor name as input and returns True or False to indicate if the sensor should be loaded (and used in the optimization). The syntax is `lambda name: f(x)`, where `f(x)` is a Python expression. Example: `"lambda name: name in ['left_laser', 'frontal_camera']"` loads only the sensors left_laser and frontal_camera.
- `-csf "COLLECTION_SELECTION_FUNCTION"` - a string to be evaluated into a lambda function that receives a collection name as input and returns True or False to indicate if the collection should be loaded (and used in the optimization). The syntax is `lambda name: f(x)`, where `f(x)` is a Python expression. Example: `"lambda name: int(name) > 5"` loads only collections 6, 7, etc.
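The selection-function strings above are ordinary Python lambdas evaluated at load time. A minimal sketch of how such a filter can be applied (the helper name `filter_names` is ours for illustration, not part of ATOM):

```python
def filter_names(names, selection_function_string):
    # Evaluate the user-supplied string into a callable, as the
    # -ssf/-csf options describe, then keep only the accepted names.
    f = eval(selection_function_string)
    return [name for name in names if f(name)]
```

For instance, `filter_names(["4", "5", "6", "7"], "lambda name: int(name) > 5")` returns `["6", "7"]`.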
Contributors:

- Miguel Riem Oliveira - University of Aveiro
- Afonso Castro - University of Aveiro
- Eurico Pedrosa - University of Aveiro
- Tiago Madeira - University of Aveiro
- André Aguiar - INESC TEC
Maintainers:

- Miguel Riem Oliveira - University of Aveiro
- Eurico Pedrosa - University of Aveiro