Owner: Alan Hong, Northwestern University, MSR 2016
Project write-up available here.
Table of Contents
1. Package Description
2. ROS Package Dependencies
3. Launch Files
4. Notes on Package Usage
## Package Description

The liam_neeson package combines an ASUS Xtion PRO LIVE camera (or an equivalent RGB-D sensor, e.g. a Kinect) with a Baxter robot to visually track and interact with a predetermined object. The implemented demo commands the robot to continually point at the object as the user moves it around the camera's field of view.
A trained Haar classifier, in conjunction with time and color filters, locates the object in the camera's RGB image feed. The detected pixel location is combined with a rectified depth image to determine the object's real-world coordinates, which are then published on the /obj_position topic.
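The pixel-plus-depth step amounts to standard pinhole back-projection. A minimal sketch follows; the intrinsics `fx`, `fy`, `cx`, `cy` below are placeholder values for illustration, not the Xtion's actual calibration:

```python
import numpy as np

def pixel_to_camera_frame(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with a metric depth value into
    3D coordinates in the camera frame (pinhole camera model)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Placeholder intrinsics (illustrative only).
fx = fy = 525.0
cx, cy = 319.5, 239.5

point = pixel_to_camera_frame(420, 240, 1.2, fx, fy, cx, cy)
```

In practice these intrinsics come from the camera's calibration (e.g. the `/camera/rgb/camera_info` topic) rather than hard-coded constants.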
Once received by the robot controller, the coordinates are transformed from the camera frame into the robot frame. The robot can then use the transformed coordinates to perform tasks.
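The camera-to-robot adjustment is a rigid-body transform. In a live system this would typically come from tf and the robot/camera calibration; the rotation and translation values below are made up for illustration:

```python
import numpy as np

def camera_to_robot(p_cam, R, t):
    """Transform a point from the camera frame into the robot frame
    using rotation matrix R and translation vector t."""
    return R @ np.asarray(p_cam, dtype=float) + t

# Illustrative transform only: camera facing the robot, mounted
# 1 m in front of and 0.5 m above the robot base.
R = np.array([[ 0.,  0., 1.],
              [-1.,  0., 0.],
              [ 0., -1., 0.]])
t = np.array([1.0, 0.0, 0.5])

p_robot = camera_to_robot([0.0, 0.0, 2.0], R, t)
```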
This package includes the following major components:
- Script that performs robust object detection and segmentation, and extracts the object's real-world coordinates.
- Demo that points the robot's end-effector towards the object at regular intervals.
- Launch file to set up the environment and run all of the above.
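At its core, the pointing demo reduces to computing a unit direction vector from the end-effector to the object in the robot base frame. A sketch of just that vector math (the positions below are illustrative):

```python
import numpy as np

def pointing_direction(ee_pos, obj_pos):
    """Unit vector from the end-effector position toward the
    object, both expressed in the robot base frame."""
    d = np.asarray(obj_pos, dtype=float) - np.asarray(ee_pos, dtype=float)
    n = np.linalg.norm(d)
    if n == 0.0:
        raise ValueError("end-effector and object positions coincide")
    return d / n

# Illustrative positions: object 0.6 m in front of the end-effector.
direction = pointing_direction([0.5, 0.0, 0.3], [1.1, 0.0, 0.3])
```

The controller would then convert this direction into an end-effector orientation and solve inverse kinematics for it at each update interval.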
## ROS Package Dependencies

To run the launch files successfully, the following packages must be available in your workspace:
- Baxter SDK - Contains necessary robot interface packages such as baxter_moveit_config, baxter_interface, baxter_examples, etc.
- OpenNI - Contains RGBD camera interface packages
## Launch Files

- The overarching launch file of the project. It calls the necessary robot and camera launch files and runs the object-classifier and robot-control scripts.
- Launches nodes relevant to Baxter and its interface
openni.launch will start multiple nodes under the /camera namespace.
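A top-level launch file wiring these pieces together might look like the following. The include path for openni.launch is standard, but the liam_neeson node names here are illustrative, not the package's actual ones:

```xml
<launch>
  <!-- Bring up the RGB-D camera; its nodes run under the /camera namespace -->
  <include file="$(find openni_launch)/launch/openni.launch" />

  <!-- Illustrative node names for the detector and controller scripts -->
  <node pkg="liam_neeson" type="object_detection" name="object_detection" output="screen" />
  <node pkg="liam_neeson" type="robot_control" name="robot_control" output="screen" />
</launch>
```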