Interactive Simon Says Game (Kinect + ROS + TurtleBot)

Simon says, “Right hand up!”

Simon Says is a children’s game in which ‘Simon’ gives directions and then judges whether the players have followed them. Players must complete the action if the sentence begins with “Simon says…” and must remain still if a bare instruction is given. We used a TurtleBot for the robot aspect, the Robot Operating System (ROS) to interface with the robot, and a Kinect v1 for vision to analyze the players’ movements. We used several libraries to gather skeletal data from the players, and then analyzed their joint positions to assess whether they had followed Simon’s instructions. A minimal sketch of the judging rule is shown below.
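To make the rule concrete, here is a hypothetical sketch of the judging logic (the function name and action labels are illustrative assumptions, not our exact implementation):

# Hypothetical sketch of the Simon Says judging rule.
# The function name and action labels are illustrative assumptions.

def judge_round(command, performed_action, commanded_action):
    """Return True if the player survives this round."""
    simon_said = command.lower().startswith("simon says")
    if simon_said:
        # The sentence began with "Simon says": the action must be performed.
        return performed_action == commanded_action
    # No "Simon says" prefix: the player must remain still.
    return performed_action == "still"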

Team testing the Simon Says game!

Team members: Michael Balmes, Zach Smialek, Nhan Tran

Hardware Setup

TurtleBot with ROS Indigo installed on a laptop.

 

Microsoft Kinect

Software Integration

We used the Open Natural Interaction (OpenNI) framework, which provided the interface needed for both physical devices and middleware components. OpenNI provided an API to help with analyzing the actions of the players. We chose the NiTE middleware, v1.5.2.23, to perform skeletal tracking. Additionally, we went with a synthetic voice to give the TurtleBot a personality, in hopes of yielding a more natural interaction with the players. If the players were on a winning streak, the system would compliment them: “You’re doing fantastic!” If they were losing, it would encourage them: “Nice try! I love to watch you play!” A sketch of how such feedback can be spoken in ROS is shown below.
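For illustration, here is a minimal sketch of spoken feedback using the ROS sound_play package (the package choice and the win-streak threshold are assumptions for this example; the post does not specify which text-to-speech tool we used):

# Hypothetical sketch: spoken feedback via the ROS sound_play package.
# The package choice and win-streak threshold are assumptions for illustration.
import rospy
from sound_play.libsoundplay import SoundClient

def speak_feedback(sc, win_streak):
    if win_streak >= 3:
        sc.say("You're doing fantastic")
    else:
        sc.say("Nice try! I love to watch you play!")

if __name__ == '__main__':
    rospy.init_node('simon_feedback')
    sc = SoundClient()
    rospy.sleep(1.0)  # give the sound_play node time to connect
    speak_feedback(sc, win_streak=3)

This assumes the sound_play node is already running (rosrun sound_play soundplay_node.py).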

During an outreach event, Mitchell Math and Science Night, I brought the system along with other projects created by the Mines Robotics Club and Mines ACM members. With these projects, the volunteer members and I facilitated three interactive demonstrations and activities (Simon Says, maze solving with a robot, and challenging the computer to find the shortest path). We had lots of fun and answered the many interesting questions that the kids, and even their parents, had.

Demo project at Mitchell Math and Science Night

Back to software integration: we utilized ROS’s tf package to keep track of the various coordinate frames over time. Taking advantage of ROS’s publisher-subscriber model, we created a publisher that sends the Kinect data and a tf listener that processes the frame transformations, roughly as sketched below.
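Here is a minimal sketch of such a tf listener. The frame names /openni_depth_frame and /left_hand_1 follow openni_tracker’s conventions (the numeric suffix is the tracked user’s ID), but treat the exact names as assumptions for your setup:

# Minimal tf listener sketch: look up a joint frame published by openni_tracker.
# Frame names here are assumptions based on openni_tracker's conventions.
import rospy
import tf

if __name__ == '__main__':
    rospy.init_node('skeleton_listener')
    listener = tf.TransformListener()
    rate = rospy.Rate(10.0)
    while not rospy.is_shutdown():
        try:
            (trans, rot) = listener.lookupTransform(
                '/openni_depth_frame', '/left_hand_1', rospy.Time(0))
            rospy.loginfo("left hand at x=%.2f y=%.2f z=%.2f", *trans)
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            pass
        rate.sleep()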

The skeletal data was used to compute distances and angles between different joints, such as the shoulders, hands, and head. Once these relationships between the joints were defined, the player’s action could be determined. For example, if both hands are higher than the head, the action is interpreted as “arms up.” A sketch of this check is shown below.
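As an illustrative sketch (joint positions are assumed to be (x, y, z) tuples, e.g. from the tf listener above, in a frame where y increases upward):

# Illustrative pose check: classify the player's action from joint positions.
# Assumes (x, y, z) joint positions in a frame where y increases upward.

def classify_action(head, left_hand, right_hand):
    """Return a coarse action label from three joint positions."""
    left_up = left_hand[1] > head[1]
    right_up = right_hand[1] > head[1]
    if left_up and right_up:
        return "arms up"
    if left_up:
        return "left hand up"
    if right_up:
        return "right hand up"
    return "arms down"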

NiTE’s representation of joints.

 

System requirements:

After setting up the ROS environment, you must install the Kinect driver.

Please follow these steps:

  1. Open a terminal, do an apt-cache search for libopenni, and install both the 0 and -dev libraries.
sudo apt-get install libopenni0 libopenni-dev
  2. Clone openni_camera and openni_launch from GitHub into your catkin_ws/src and run catkin_make in the workspace folder.
cd ~/catkin_ws/src
git clone https://github.com/ros-drivers/openni_launch
git clone https://github.com/ros-drivers/openni_camera
cd ..
catkin_make
catkin_make install
  3. Connect the Kinect and run the openni.launch file.
roscore
roslaunch openni_launch openni.launch

If the above steps fail, clone this repo, cd into its Bin folder, and extract and install the relevant driver archive:

git clone https://github.com/avin2/SensorKinect
cd Bin
tar xjf SensorKinect093-Bin-Linux-x64-v5.1.2.1.tar.bz2
cd Sensor-Bin-Linux-x64-v5.1.2.1
./install.sh
  4. Test out your build by running
roslaunch openni_launch openni.launch
  5. Install NiTE v1.5.2.23
Download the zip (based on your OS) from
http://www.openni.ru/openni-sdk/openni-sdk-history-2/index.html
Extract it, go to the folder, and run:
sudo ./install.sh
  6. Install OpenNI Tracker
cd ~/catkin_ws/src
git clone https://github.com/ros-drivers/openni_tracker
cd ..
catkin_make
catkin_make install
source devel/setup.bash
source /opt/ros/indigo/setup.bash
  7. Run OpenNI Tracker after launching openni.launch
rosrun openni_tracker openni_tracker

Now you should be able to successfully launch OpenNI and listen to all the outputs of the Kinect! You may need to run camera calibration scripts to get rid of edge distortions and the like, but otherwise the camera should be all set to go.

 

Future work:

We hope to collect more Kinect data and skeleton-based representations to improve classification with a support vector machine (SVM). Running SVM models on the data we had, we got a fairly low accuracy of ~63%. The graphs below were generated using LIBSVM; a minimal sketch of this kind of classifier follows.
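For reference, here is a hedged sketch of such a classifier using scikit-learn’s SVC (which wraps LIBSVM internally); the flattened-joint feature layout and the random placeholder data are assumptions for illustration, not our actual dataset:

# Hypothetical sketch: SVM classification of skeleton-based pose features.
# The feature layout and the random placeholder data are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# X: one row per recorded pose; columns = flattened (x, y, z) joint positions.
X = np.random.rand(200, 15 * 3)        # e.g. 15 joints, placeholder data
y = np.random.randint(0, 4, size=200)  # e.g. 4 action classes

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
clf = SVC(kernel='rbf', C=1.0, gamma='scale')
clf.fit(X_train, y_train)
print("accuracy: %.2f" % clf.score(X_test, y_test))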

 
