
Put That Here: Situated Multimodal Instruction of Robot Pick-and-Place Tasks

Authors: Ji Han, Ze Li, Chien-Ming Huang


Prerequisites

  1. Purple gloves for both hands
  2. Green tablecloth
  3. Top-down webcam
  4. UR5 robot arm
  5. ROS Kinetic environment
  6. OpenCV

Installation

git clone https://github.com/intuitivecomputing/put-that-here.git

Clone the package into the src folder of a catkin workspace and build it (e.g., with catkin_make). The following dependencies are also required:

UR5 Driver

Google Speech-to-Text

Usage

This project supports multimodal robot instruction using four kinds of gestures together with various verbal commands.

Gesture Recognition

rosrun put_that_here hand_tracking_node.py
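Given the prerequisites (purple gloves, green tablecloth, top-down webcam, OpenCV), the hand tracking presumably relies on color segmentation. The following is a minimal sketch of that idea, not the repository's hand_tracking_node.py: the HSV bounds and the /hand_position topic name are assumptions that would need tuning and remapping for a real setup.

```python
#!/usr/bin/env python
# Minimal color-based glove tracking sketch (NOT the repository's
# hand_tracking_node.py). HSV bounds and topic name are assumptions.
import cv2
import numpy as np
import rospy
from geometry_msgs.msg import Point


def main():
    rospy.init_node('hand_tracking_sketch')
    pub = rospy.Publisher('/hand_position', Point, queue_size=1)  # hypothetical topic
    cap = cv2.VideoCapture(0)  # top-down webcam
    # Rough HSV range for purple gloves; tune for your gloves and lighting.
    lower = np.array([125, 80, 80])
    upper = np.array([155, 255, 255])
    rate = rospy.Rate(30)
    while not rospy.is_shutdown():
        ok, frame = cap.read()
        if not ok:
            continue
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower, upper)
        # [-2] keeps this working across OpenCV 2/3/4 return conventions.
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)[-2]
        if contours:
            m = cv2.moments(max(contours, key=cv2.contourArea))
            if m['m00'] > 0:
                # Publish the centroid of the largest purple blob (pixels).
                pub.publish(Point(m['m10'] / m['m00'], m['m01'] / m['m00'], 0.0))
        rate.sleep()


if __name__ == '__main__':
    main()
```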

Speech Recognition

rosrun put_that_here Speech_node.py
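The speech node relies on Google Speech-to-Text (see Installation). As a rough, non-streaming sketch of that pipeline (not the repository's Speech_node.py), assuming credentials are configured via GOOGLE_APPLICATION_CREDENTIALS and a /speech_command topic of my own invention; exact class paths vary slightly across google-cloud-speech library versions:

```python
#!/usr/bin/env python
# Minimal sketch: transcribe one recorded utterance and publish it as a
# ROS String (NOT the repository's Speech_node.py).
import io

import rospy
from std_msgs.msg import String
from google.cloud import speech


def transcribe(wav_path):
    client = speech.SpeechClient()
    with io.open(wav_path, 'rb') as f:
        audio = speech.RecognitionAudio(content=f.read())
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code='en-US')
    response = client.recognize(config=config, audio=audio)
    return ' '.join(r.alternatives[0].transcript for r in response.results)


if __name__ == '__main__':
    rospy.init_node('speech_sketch')
    pub = rospy.Publisher('/speech_command', String, queue_size=1)  # hypothetical topic
    rospy.sleep(0.5)  # give the publisher time to register with the master
    pub.publish(String(data=transcribe('command.wav')))  # example input file
```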

Autonomous Manipulation

rosrun put_that_here pick_ik.py
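To illustrate how a node can command the UR5 through the driver, here is a minimal sketch (not the repository's pick_ik.py) that sends a single joint-space goal over a FollowJointTrajectory action interface; the action name follow_joint_trajectory is an assumption and depends on the driver/controller configuration.

```python
#!/usr/bin/env python
# Minimal sketch of one joint-space move on a UR5 (NOT the repository's
# pick_ik.py). Action name depends on your driver configuration.
import actionlib
import rospy
from control_msgs.msg import FollowJointTrajectoryAction, FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint

JOINTS = ['shoulder_pan_joint', 'shoulder_lift_joint', 'elbow_joint',
          'wrist_1_joint', 'wrist_2_joint', 'wrist_3_joint']


def move_to(positions, seconds=3.0):
    client = actionlib.SimpleActionClient(
        'follow_joint_trajectory', FollowJointTrajectoryAction)
    client.wait_for_server()
    goal = FollowJointTrajectoryGoal()
    goal.trajectory.joint_names = JOINTS
    goal.trajectory.points.append(JointTrajectoryPoint(
        positions=positions, time_from_start=rospy.Duration(seconds)))
    client.send_goal(goal)
    client.wait_for_result()


if __name__ == '__main__':
    rospy.init_node('pick_sketch')
    move_to([0.0, -1.57, 1.57, -1.57, -1.57, 0.0])  # example pose (radians)
```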

Wizard-of-Oz Manipulation

rosrun put_that_here control_group.py

Signaling Feedback

This node should be run on a second computer connected to the projector; both machines must share the same ROS master (set ROS_MASTER_URI accordingly).

rosrun put_that_here projector.py
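As a sketch of how projected signaling feedback can be rendered (not the repository's projector.py), the following draws a highlight at a commanded position in a fullscreen OpenCV window; the /highlight_position topic and the projector resolution are assumptions.

```python
#!/usr/bin/env python
# Minimal projected-feedback sketch (NOT the repository's projector.py):
# draws a highlight circle at the last commanded pixel position.
import cv2
import numpy as np
import rospy
from geometry_msgs.msg import Point

W, H = 1280, 800      # assumed projector resolution
target = [None]       # last commanded position, shared with the callback


def on_target(msg):
    target[0] = (int(msg.x), int(msg.y))


if __name__ == '__main__':
    rospy.init_node('projector_sketch')
    rospy.Subscriber('/highlight_position', Point, on_target)  # hypothetical topic
    cv2.namedWindow('feedback', cv2.WND_PROP_FULLSCREEN)
    cv2.setWindowProperty('feedback', cv2.WND_PROP_FULLSCREEN,
                          cv2.WINDOW_FULLSCREEN)
    while not rospy.is_shutdown():
        canvas = np.zeros((H, W, 3), np.uint8)
        if target[0]:
            cv2.circle(canvas, target[0], 40, (0, 255, 0), 4)
        cv2.imshow('feedback', canvas)
        cv2.waitKey(30)
```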

References

Google Cloud API

python_UR5_ikSlover
