
sr_vision

Overview

Contains our vision-related algorithms (segmentation, tracking, recognition, etc.).

  1. Tracking: contains the tracker executable. The launch file starts the video acquisition as well as the tracking and visualization nodes. A color parameter is required for the segmentation.
  2. Segmentation: contains nodes for color- and shape-based segmentation. Both the color and the shape can be set as parameters (the defaults are red and strawberry). A minimal sketch of the color-based idea appears after this list.
  3. Benchmarking: benchmarking package for the different segmentation algorithms.
  4. Visualization: displays the image from the camera, lets the user select a region of interest, and displays the tracking result.
  5. PointCloud utils: contains a tracker, a point cloud triangulator, a segmentation tool, and a tool to transform point clouds.
  6. Messages: all messages, services, and actions for sr_vision.
  7. Extrinsic camera calibration: contains nodes for extrinsic camera calibration based on Alvar marker positions seen by the camera and on the real robot.
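To give a sense of how the color-based segmentation fits together, here is a minimal, hypothetical sketch of a node that thresholds an incoming camera image in HSV space and publishes the resulting mask. The topic names, node name, and HSV values are illustrative assumptions, not the actual sr_vision API.

```python
#!/usr/bin/env python
# Hypothetical sketch only -- topic and node names are assumptions,
# not the real sr_vision interfaces.
import rospy
import cv2
import numpy as np
from cv_bridge import CvBridge
from sensor_msgs.msg import Image


class ColorSegmentationSketch(object):
    def __init__(self):
        self.bridge = CvBridge()
        # Assumed HSV boundaries for "red"; the real nodes take the color
        # as a parameter and can load calibrated values instead.
        self.lower = np.array([0, 120, 70])
        self.upper = np.array([10, 255, 255])
        self.pub = rospy.Publisher('segmentation/mask', Image, queue_size=1)
        rospy.Subscriber('camera/image_raw', Image, self.on_image)

    def on_image(self, msg):
        # Convert the ROS image to OpenCV, threshold in HSV, publish the mask.
        frame = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, self.lower, self.upper)
        self.pub.publish(self.bridge.cv2_to_imgmsg(mask, 'mono8'))


if __name__ == '__main__':
    rospy.init_node('color_segmentation_sketch')
    ColorSegmentationSketch()
    rospy.spin()
```

The actual packages split acquisition, segmentation, tracking, and visualization into separate nodes, as described in the list above.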

You can find the architecture diagram below for a closer look at how this works.

Architecture Diagram

Usage

For the segmentation, different colors are available: red, blue, green, and yellow. A custom one can be added with the calibration script (see below for usage). Several shape models are also available: circle, rectangle, star, strawberry, banana, and leaf. See the segmentation doc to add personalized shapes.

With a Kinect

roslaunch sr_object_tracking tracking.launch kinect:=true color:=<color>

With a UVC camera

roslaunch sr_object_tracking tracking.launch color:=<color>

Adding a custom color

A calibration script is provided to set the proper boundaries used to process the image (these can vary with the lighting conditions, the camera, etc.).

Usage

roslaunch sr_object_tracking calibration.launch

The original image is displayed on the left and the mask on the right. The Hue, Saturation, and Value boundaries the mask is based on can be adjusted with the track bars. Finally, these values can be saved to a yaml file with the switch track bar at the bottom. To use them in the tracking, specify the 'custom' color argument.
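As an illustration of the calibration workflow described above, the following stand-alone sketch uses OpenCV trackbars to tune HSV boundaries against a live camera feed and dumps them to a yaml file. It is a sketch under assumed window names, key bindings, and file layout, not the actual sr_object_tracking calibration script.

```python
#!/usr/bin/env python
# Illustrative HSV calibration sketch (assumptions only, not the real script).
import cv2
import yaml


def nothing(_):
    pass


cv2.namedWindow('mask')
# One trackbar per HSV boundary; mins start at 0, maxes at their upper limit.
for name, maximum in [('H min', 179), ('H max', 179),
                      ('S min', 255), ('S max', 255),
                      ('V min', 255), ('V max', 255)]:
    cv2.createTrackbar(name, 'mask', 0 if 'min' in name else maximum,
                       maximum, nothing)

cap = cv2.VideoCapture(0)  # assumed UVC camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lower = tuple(cv2.getTrackbarPos(n, 'mask') for n in ('H min', 'S min', 'V min'))
    upper = tuple(cv2.getTrackbarPos(n, 'mask') for n in ('H max', 'S max', 'V max'))
    mask = cv2.inRange(hsv, lower, upper)
    cv2.imshow('original', frame)
    cv2.imshow('mask', mask)
    # Press 's' to save the current boundaries as a custom colour and exit.
    if cv2.waitKey(1) & 0xFF == ord('s'):
        with open('custom_color.yaml', 'w') as f:
            yaml.dump({'custom': {'lower': list(lower), 'upper': list(upper)}}, f)
        break

cap.release()
cv2.destroyAllWindows()
```

Once a custom color has been saved, the tracking can be launched with the 'custom' color argument as noted above.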
