This repository implements a semi-autonomous control strategy for an assistive robotic manipulator that utilises eye gaze to infer the intention of the user. A demonstration of a simple pick-and-place task can be seen below.
The system is built around the following pipeline, which employs several subcomponents.
Many of these components were developed specifically for this project as separate packages, and each of them can also run as a standalone node. More information can be found in their respective repositories.
Manipulation was implemented with MoveIt 2; however, all tests were performed in a virtual environment (mainly due to the immature state of ROS 2 controllers as of May 2020).
First, clone this repository into your favourite workspace. Then import all other repositories listed in ecard.repos.
```bash
mkdir -p <awesome_ws>/src && cd <awesome_ws>/src
git clone https://github.com/AndrejOrsula/ecard
vcs import < ./ecard/ecard.repos
```
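The `ecard.repos` file consumed by `vcs import` is a standard vcstool repositories file. As a sketch of the format (the entry name, URL, and version below are illustrative, not the actual contents of `ecard.repos`):

```yaml
repositories:
  some_dependency:   # hypothetical entry name
    type: git
    url: https://github.com/example/some_dependency.git
    version: master
```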
- OpenFace 2 (tested with 2.2.0) - follow this script if in doubt
- librealsense (tested with 2.34.0)
All other dependencies can be installed via rosdep.
```bash
cd <awesome_ws>/src
rosdep install --from-paths . --ignore-src --rosdistro ${ROS_DISTRO}
```
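If rosdep has never been initialised on your machine, its database must be set up first. This is standard one-time rosdep usage, not specific to this repository:

```bash
sudo rosdep init   # one-time setup of the rosdep sources list
rosdep update      # fetch the latest dependency index (run as a regular user)
```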
Build all packages with colcon.
```bash
cd <awesome_ws>
colcon build --symlink-install --cmake-args "-DCMAKE_BUILD_TYPE=Release"
```
First, source the ROS 2 global installation (if not done before).
```bash
source /opt/ros/eloquent/setup.bash
```
Then source the ROS 2 workspace overlay (if not done before).
```bash
source <awesome_ws>/install/local_setup.bash
```
Edit the RealSense face camera config and scene camera config to match the serial numbers of your cameras.
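The exact layout of these config files depends on the launch setup; as a sketch, a typical ROS 2 parameter file for a RealSense camera keys the device by its serial number (the file path, node name, and parameter name below are assumptions — check the actual configs in this repository):

```yaml
# e.g. config/face_camera.yaml (illustrative)
/face_camera:
  ros__parameters:
    serial_no: "012345678901"  # replace with the serial of your device
```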
Finally, you can run the personal calibration and the stack itself.
```bash
# Personal calibration
ros2 launch ecard calibrate_eyeball.launch.py
ros2 launch ecard calibrate_kappa.launch.py
# Stack
ros2 launch ecard ecard.launch.py
```
This project is licensed under the BSD 3-Clause License.