Hand

let’s high five, heather!

Vision

   hey! there is a hand

https://youtu.be/woM99mlMBpM

Our main goal in the Roboy Hand Project is to recognize gestures and mimic them by executing them with Roboy’s hand.

Our Goal

this semester we want roboy to…

classify gestures
localize hands
execute gestures

Abstract

how we did it

Gesture Recognition

The aim of this part is to recognize a set of gestures. We decided on a deep learning approach, building on a ROS package that provides a pretrained model for shape recognition. In order to train a good model, we need a good dataset; we are currently using a dataset (link) that provides both RGB and depth images.
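
As an illustration, here is a minimal sketch of how such a gesture classifier could be queried, assuming a pretrained backbone with a new 10-class head; the network, preprocessing and file name are placeholders, not the project’s actual model.

    # Minimal sketch: classify a gesture in an RGB image with a pretrained CNN.
    # resnet18, the 10-class head and 'hand.png' are illustrative assumptions,
    # not the project's actual network or data.
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image

    NUM_GESTURES = 10
    model = models.resnet18(pretrained=True)                         # pretrained backbone
    model.fc = torch.nn.Linear(model.fc.in_features, NUM_GESTURES)   # new gesture head
    model.eval()

    preprocess = T.Compose([
        T.Resize((224, 224)),
        T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    def classify(image_path):
        x = preprocess(Image.open(image_path).convert('RGB')).unsqueeze(0)
        with torch.no_grad():
            logits = model(x)
        return int(logits.argmax(dim=1))  # gesture ID handed on to execution

    print(classify('hand.png'))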

Gesture Execution

The recognition part is connected to gesture execution via ROS. Gesture execution receives the unique ID of the recognized gesture and passes it to a ROS service client. The client sends the ID to a service server, which looks up the joint angles of each finger and returns them to the client.
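
A minimal sketch of the client side of this exchange, assuming a hypothetical GestureToJointAngles service in a roboy_hand package; the real .srv definition and service name live in the project repository.

    #!/usr/bin/env python
    # Sketch of the gesture-execution service client described above.
    # 'roboy_hand', GestureToJointAngles and 'gesture_to_joint_angles' are
    # hypothetical names; the request carries a gesture ID and the response
    # returns per-finger joint angles.
    import rospy
    from roboy_hand.srv import GestureToJointAngles

    def request_joint_angles(gesture_id):
        rospy.wait_for_service('gesture_to_joint_angles')
        try:
            lookup = rospy.ServiceProxy('gesture_to_joint_angles', GestureToJointAngles)
            response = lookup(gesture_id)      # server maps ID -> finger joint angles
            return response.joint_angles       # forwarded to the hand controller
        except rospy.ServiceException as e:
            rospy.logwarn("gesture lookup failed: %s", e)
            return None

    if __name__ == '__main__':
        rospy.init_node('gesture_execution_client')
        angles = request_joint_angles(3)       # e.g. gesture with ID 3
        rospy.loginfo("joint angles: %s", angles)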

Results

   roboy can now…

https://youtu.be/uKjYwrlTxno

classify gestures

10 different gestures using a trained neural network

localize hands

localization & classification work in real time

execute gestures

in simulation with torque control

The Team

get to know the hand team

Team members SS2018

Simon Trendel (Team Lead)
Abhimanyu Sharma (Agile Coach)
Baris Yazici
Bilal Vural
Kai Wu

Links

code, documentation & presentations

Where to go next

a hand for more tasks

  • Better Neural Network:
    expand the dataset with images from different angles
  • Real Hand Execution:
    use a learning algorithm for control
  • Stronger Hand:
    a more powerful hand to perform more tasks
  • Grasp Objects