Join the ROBOY TEAM 

WS18/19


The goal of the upcoming semester is ROBOY RIKSHAW. One of the underlying foundations for reaching this goal is autonomous driving on the electric tricycle. As a multi-faceted project, it consists of numerous components:

  • Simultaneous localization and mapping (SLAM)
  • Obstacle detection on camera and LiDAR data
  • Sensor fusion
  • Tricycle motion model creation and path planning
  • Collision avoidance
  • Implementation (followed by deployment on Roboy) of reliable control algorithms for steering and pedaling
  • Autonomous driving safety system
  • Tricycle state tracking
  • Tricycle remote control (for emergency cases)
  • Tricycle ergonomic feature design (passenger seat & sensors, maneuvering and warning signals, safety gear)
  • Roboy mobile battery system

 

In addition to our main semester goal, we will continue developing Roboy core technologies and explore new areas.

 

ROBOY TEAM PROJECTS

for WS18/19

The Roboy Team is always looking for the best and most dedicated students to help us achieve great things for the future. We guarantee a rewarding environment in which you will want to work, and we offer challenging, international and future-oriented tasks.
The following symbols represent your field of study.

♣ Informatics
♦ Electrical Engineering
α Mechanical Engineering
♥ Robotics
♠ Industrial Design


Interaction

  • The goal of this project is to utilize the range of emotions Roboy can show during interaction with its environment, primarily conversations with people. Currently Roboy’s face can show around twenty emotions, including happy, shy, irritated, and hypnotizing eyes. You will work with the Roboy Dialog System, Roboy Unity Face, and the Deeply SDK. The expected result is an algorithm, deployed on Roboy and the Roboy Telegram bot, that takes as input the user utterance, the user’s emotion and the intended Roboy response, and yields a naturally fitting facial expression. A minimal interface sketch follows this project list.

    Skills:

    • Python and Java programming skills
    • Unity 3D
  • With the rise of generative deep neural networks, the quality of speech synthesis has improved vastly. We plan to upgrade Roboy’s text-to-speech module by implementing and training a DNN (inspired by Tacotron and WaveNet) that has Roboy’s unique voice (data to be collected) and exhibits natural intonation, tone, stress, and rhythm. A small data-preparation sketch follows this project list.

    Skills:

    • deep learning experience
    • advanced Python
    • Tensorflow/Pytorch
    • basic understanding of audio signal processing
  • The open-domain nature of Roboy’s conversations is the main challenge in building a well-performing dialog system. Since users ask a range of Roboy-related and general-knowledge questions, as well as make statements about themselves, it is not always trivial to find a proper response. At the moment the Roboy Dialog System relies heavily on hand-crafted patterns, which hinder the robustness and flexibility of the dialog flow. To improve performance, we will next use a neural open information extraction approach (with an encoder-decoder framework), subsequently connected to an ontology. Finally, the intent will be defined for each user input using the collected data.

    Skills:

    • natural language processing techniques understanding and experience
    • advanced Python
    • Tensorflow/Pytorch
  • This module will be integrated into the Roboy Dialog System in order to improve the variability of Roboy’s responses, including sentence structure and use of synonyms. In addition, it should form full sentences based on the information retrieved from the knowledge base, which is usually represented in the form of subject-predicate-object triples. A template-based sketch follows this project list.

    Skills:

    • solid Java skills
    • natural language processing techniques understanding and experience
  • The design objective is to develop an integrated system for autonomous rigid-object pick-up tasks in domestic environments, focusing on the gripping of unmodeled objects and exploiting sensor feedback from the robot hand to monitor the grasp. You will design the perception system based on time-of-flight range data, the grasp pose optimization algorithm and the grasp execution. The performance and robustness of the system will be validated in experiments, including pick-up tasks. A structural sketch of the pipeline follows this project list.

    Skills: 

    • Knowledge of ROS and basic ARDUINO programming;
    • Very good knowledge of Python/C++/MATLAB and their scientific computing libraries.
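
For the facial-expression project above, here is a minimal, rule-based sketch of the intended interface, assuming a simple lookup from the detected user emotion and a few keywords to one of Roboy's expressions. The function name, the expression list and the mapping are illustrative placeholders, not the actual Roboy Dialog System API.

```python
# Minimal, rule-based sketch: map the dialog context (user utterance, detected
# user emotion, planned Roboy response) to a facial expression label.
# All names and mappings below are illustrative placeholders.

EXPRESSIONS = ["happy", "shy", "irritated", "hypnotized", "neutral"]

# Hypothetical mapping from a detected user emotion to a default expression.
EMOTION_TO_EXPRESSION = {
    "joy": "happy",
    "anger": "irritated",
    "surprise": "hypnotized",
}

def select_expression(utterance: str, user_emotion: str, roboy_response: str) -> str:
    """Return a facial expression label for the given dialog turn."""
    text = (utterance + " " + roboy_response).lower()
    if "compliment" in text or "thank" in text:
        return "shy"
    return EMOTION_TO_EXPRESSION.get(user_emotion, "neutral")

if __name__ == "__main__":
    print(select_expression("Thank you, Roboy!", "joy", "You are welcome."))
```

A learned model would eventually replace the hand-written rules, but the input/output contract stays the same.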
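
For the text-to-speech project, a small data-preparation sketch: converting recorded voice clips into log-mel spectrograms, a common training target for Tacotron-style models. The file name and parameters are assumptions; the network itself is out of scope here.

```python
# Data-preparation sketch for TTS training: load a voice clip and compute a
# log-scaled mel spectrogram. File name and parameters are placeholders.
import librosa
import numpy as np

def to_mel_spectrogram(wav_path: str, sr: int = 22050, n_mels: int = 80) -> np.ndarray:
    """Load a clip and return a log-mel spectrogram of shape (n_mels, frames)."""
    audio, _ = librosa.load(wav_path, sr=sr)
    mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=n_mels)
    return librosa.power_to_db(mel, ref=np.max)

if __name__ == "__main__":
    spec = to_mel_spectrogram("roboy_sample.wav")  # hypothetical recording
    print("spectrogram shape:", spec.shape)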
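
For the response-generation project, a minimal template-based sketch of verbalizing a subject-predicate-object triple with simple synonym variation. The real module would live inside the Java-based Roboy Dialog System; this Python snippet only illustrates the idea, and all names are placeholders.

```python
# Template-based sketch: turn a knowledge-base triple into a sentence,
# varying the predicate with synonyms and the sentence frame with templates.
import random

SYNONYMS = {
    "likes": ["likes", "enjoys", "is fond of"],
    "is": ["is", "happens to be"],
}

TEMPLATES = [
    "{subject} {predicate} {object}.",
    "As far as I know, {subject} {predicate} {object}.",
]

def verbalize(subject: str, predicate: str, obj: str) -> str:
    """Turn a subject-predicate-object triple into a natural-language sentence."""
    predicate = random.choice(SYNONYMS.get(predicate, [predicate]))
    template = random.choice(TEMPLATES)
    return template.format(subject=subject, predicate=predicate, object=obj)

if __name__ == "__main__":
    print(verbalize("Roboy", "likes", "ice cream"))
```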
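
For the pick-up project, a structural sketch of how the three stages (perception on time-of-flight range data, grasp pose optimization, monitored execution) could be wired together. The class and method names, and the centroid-based "optimization", are placeholders for illustration only.

```python
# Structural sketch of a pick-up pipeline: perception, grasp pose selection,
# and execution monitored through hand-sensor feedback. Placeholder logic only.
import numpy as np

class PickUpPipeline:
    def perceive(self, range_image: np.ndarray) -> np.ndarray:
        """Convert a ToF range image into a rough 3-D point set."""
        h, w = range_image.shape
        ys, xs = np.mgrid[0:h, 0:w]
        return np.stack([xs.ravel(), ys.ravel(), range_image.ravel()], axis=1)

    def optimize_grasp_pose(self, points: np.ndarray) -> np.ndarray:
        """Placeholder optimization: aim the gripper at the object centroid."""
        return points.mean(axis=0)

    def execute(self, grasp_pose: np.ndarray, force_feedback) -> bool:
        """Close the hand and use finger-sensor feedback to judge the grasp."""
        return force_feedback(grasp_pose) > 0.5  # hypothetical force threshold

if __name__ == "__main__":
    pipeline = PickUpPipeline()
    cloud = pipeline.perceive(np.random.rand(8, 8))
    pose = pipeline.optimize_grasp_pose(cloud)
    print("grasp ok:", pipeline.execute(pose, force_feedback=lambda p: 0.7))
```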

Sensing

  • Another recent advancement in our in-house hardware is the new motor board. The unique feature of the new motor-driver board is the Simic microphones (because why not). One of the exploratory tasks for the upcoming semester will be predictive maintenance and audio analysis of the muscle status based on the collected audio data. This will include building a test rig using the new motor boards, setting up communication to receive the microphone data, as well as the subsequent filtering and analysis. A first analysis sketch follows this project list.

    Skills:

    • Embedded programming
    • Signal processing experience
  • In order to achieve highly robust control of a tendon-driven robot, precise knowledge of the force distribution is of utmost importance. Therefore, in-place tendon force measurement is among the priority tasks for the next semester. Since the approach is not fixed, proper research on the topic will be a crucial part of the project. The expected result is a scalable way to precisely measure tendon forces that is integrated into Roboy 2.0.

    Skills:

    • knowledge of basic machining techniques, fabrication methods, materials;
    • basic CAD skills
    • PCB design 
    • Embedded programming
  • A low-cost and easy-to-fabricate 3-axis tactile sensor based on magnetic technology. The sensor consists of a small magnet immersed in a silicone body or flexible 3D-printed material, with a Hall-effect sensor placed below to detect changes in the magnetic field caused by displacements of the magnet, generated by an external force applied to the deformable body. The use of a tri-axis magnetometer makes it possible to detect the three components of the force vector, and the proposed design assures high sensitivity, low hysteresis and good repeatability of the measurement. A calibration sketch follows this project list.

    Goals:

    • An innovative, functional method to obtain contact and force estimates on robot surfaces.
    • A working tactile sensing platform that is able to isolate and detect contact and possibly forces.
    • Complete documentation of the platform.

    Desired skills:

    • Knowledge of basic machining techniques, fabrication methods, materials;
    • Knowledge of PCB design;
    • Knowledge of Python/MATLAB and their scientific computing libraries.
  • During the past semester we have been developing a pressure-sensitive silicone skin based on flexible, planar silicone waveguides. During the upcoming semester we will expand the current setup to use modulated light and new materials, allowing us to transmit data through the skin as well as achieve more robust pressure localization. Optionally, this would also enable us to make the skin translucent in order to show visual feedback with LEDs embedded under it. In addition, since the Roboy Rikshaw has to function properly outdoors, the next version of the skin has to serve as a waterproof shield for Roboy’s body and rapidly adapt to changes in ambient light.

    Skills:

    • Knowledge of basic machining techniques, fabrication methods, materials;
    • PCB design
    • Embedded programming 
    • Signal processing experience
  • Haptic feedback from artificial agents such as robotic hands, or from virtual hands in VR, facilitates a more natural interaction for the user. Haptic grippers or gloves can be used to control the virtual hands in VR, allowing more natural use of the hand. They can also be used for puppeteering robotic hands, which makes testing easier and hence iterative prototyping faster, and provides a more natural physical interface for the end user.

    Skills:

    • Knowledge of basic CAD software and materials;
    • Knowledge of basic ARDUINO programming;
    • Knowledge of Python/MATLAB and their scientific computing libraries;
  • Human motion tracking is a powerful tool used in a large range of applications that require human movement analysis. The objective is to design a set of wearables for human joint tracking, enabling accurate whole-body pose and kinematics estimation using the lighthouse tracking technique previously used on Roboy.

    Skills:

    • Knowledge of basic CAD software and materials;
    • Knowledge of basic ARDUINO programming;
    • Knowledge of the Python/C++ programming languages and their scientific computing libraries;
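
For the muscle-audio project above, a sketch of a first analysis step: computing simple spectral features per audio frame that could later feed a predictive-maintenance classifier. The sampling rate and frame length are assumptions.

```python
# Sketch of a first analysis step for the motor-board microphone data:
# simple spectral features per frame as input to later classification.
import numpy as np

SAMPLE_RATE = 16_000   # assumed microphone sampling rate
FRAME_LEN = 1024       # assumed analysis frame length

def spectral_features(frame: np.ndarray) -> dict:
    """RMS level, spectral centroid and dominant frequency of one frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return {
        "rms": float(np.sqrt(np.mean(frame ** 2))),
        "spectral_centroid_hz": centroid,
        "dominant_hz": float(freqs[np.argmax(spectrum)]),
    }

if __name__ == "__main__":
    t = np.arange(FRAME_LEN) / SAMPLE_RATE
    frame = 0.1 * np.sin(2 * np.pi * 440 * t)   # stand-in for motor audio
    print(spectral_features(frame))
```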
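
For the magnetic tactile-sensor project, a sketch of a simple calibration step: fitting a linear map from the 3-axis magnetometer reading to the applied 3-D force using known reference loads. The linearity assumption and the data below are illustrative only.

```python
# Calibration sketch: least-squares fit of a linear reading-to-force map
# from reference measurements, then force estimation at run time.
import numpy as np

def fit_calibration(readings: np.ndarray, forces: np.ndarray) -> np.ndarray:
    """Fit a 3x3 matrix M such that force ≈ M @ reading."""
    X, *_ = np.linalg.lstsq(readings, forces, rcond=None)  # forces ≈ readings @ X
    return X.T

def estimate_force(M: np.ndarray, reading: np.ndarray) -> np.ndarray:
    return M @ reading

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_M = np.array([[2.0, 0.1, 0.0], [0.0, 1.5, 0.2], [0.1, 0.0, 3.0]])
    readings = rng.normal(size=(50, 3))          # simulated magnetometer readings
    forces = readings @ true_M.T                 # simulated known reference forces
    M = fit_calibration(readings, forces)
    print(estimate_force(M, readings[0]), "vs", forces[0])
```

In practice a nonlinear, per-sensor calibration may be needed, but a linear fit is a reasonable first baseline.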

Actuation

  • This project will provide an essential tool for robot control and interaction. It will mostly deal with the robot bring-up and initialization routines and with a user interface for robot state monitoring. A minimal monitoring sketch follows this project list.

    Desired skills:

    • ROS
    • Javascript/Python programming skills
  • Striving for better and more human-like performance, we have added the improvement of the shoulder joint to our set of semester goals. This project involves exhaustive testing of the tendon routing in various configurations, modifying the joint, designing a soft joint capsule (inspired by the human articular capsule), as well as updating the current backbone to accommodate the above-mentioned changes.

    Skills:

    • Knowledge of basic machining techniques, fabrication methods, materials;
    • CAD experience
  • While building a full-size tendon-driven humanoid, we have developed our own series elastic actuator (MyoBrick). The new muscle units require stress and durability testing, followed by improvements based on the collected data. For example, a new design should incorporate a motor cooling component, such as an outer casing made of heat-conducting silicone that serves as a passive cooling element.

    Skills:

    • Knowledge of basic machining techniques, fabrication methods, materials;
    • Python programming skills
    • CAD
  • Inspired by recent advancements in deep reinforcement learning, the idea behind this project is to employ this approach for controlling Roboy’s arm, in order to perform model-free reach and grasp movements (a sketch of the learning-side interface follows this project list). The subtasks include:

    • implementation of the interface between the simulation environment (Gazebo) and the learning agent;
    • design of the network, including state vector and reward function definition;
    • training and subsequent deployment in hardware.

    Skills:

    • Pytorch/Tensorflow
    • theoretical background in deep reinforcement learning 
    • experience with Gazebo and/or OpenAI Gym, ROS
  • The design and manufacture of a low-cost, modular robotic hand for Roboy. A fully actuated hand is to be developed with individually actuated fingers. It should be able to perform everyday grasps and independently point the index finger. The weight and speed (closing time: 1.5 seconds) should be comparable to current commercial prostheses. It should be able to lift a 3 kg bag and stably grasp cylindrical objects of up to 2 kg. The hand will have 2 degrees of actuation (2 motors) with a spring-return system. The selection of each finger is done via electromagnets, while actuation is performed by the motors.

    Skills:

    • Knowledge of basic machining techniques, fabrication methods, materials;
    • Knowledge of the mechanics or control of multi-finger manipulation;
    • Knowledge of Python/MATLAB and their scientific computing libraries;
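
For the bring-up and monitoring project above, a minimal sketch of the monitoring side: a ROS node that subscribes to joint states and caches the latest values for a UI to display. The topic name and message type are common ROS defaults assumed here, not taken from the actual Roboy stack.

```python
# Minimal ROS monitoring sketch: subscribe to joint states and keep the
# latest values for a state-monitoring UI. Topic name is an assumption.
import rospy
from sensor_msgs.msg import JointState

latest_state = {}

def on_joint_state(msg: JointState) -> None:
    """Cache the most recent joint positions, keyed by joint name."""
    latest_state.update(zip(msg.name, msg.position))

if __name__ == "__main__":
    rospy.init_node("roboy_state_monitor")
    rospy.Subscriber("/joint_states", JointState, on_joint_state)
    rate = rospy.Rate(1)  # print a status line once per second
    while not rospy.is_shutdown():
        rospy.loginfo("tracking %d joints", len(latest_state))
        rate.sleep()
```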
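
For the deep-reinforcement-learning project, a sketch of the learning-side interface only: a Gym-style environment skeleton with an example state vector and a distance-based reward for reaching. The Gazebo bridge is stubbed out with random numbers; the dimensions and reward shaping are assumptions.

```python
# Gym-style environment skeleton for a reaching task. The Gazebo interface
# (first subtask above) is stubbed with random values for illustration.
import numpy as np

class ReachEnvSketch:
    def __init__(self, n_joints: int = 6):
        self.n_joints = n_joints
        self.target = np.zeros(3)

    def _observe(self) -> np.ndarray:
        """State vector: joint angles + joint velocities + target position."""
        joints = np.random.uniform(-np.pi, np.pi, self.n_joints)  # from Gazebo in the real setup
        velocities = np.zeros(self.n_joints)
        return np.concatenate([joints, velocities, self.target])

    def reset(self) -> np.ndarray:
        self.target = np.random.uniform(-0.5, 0.5, 3)             # random reach target
        return self._observe()

    def step(self, action: np.ndarray):
        hand_position = np.random.uniform(-0.5, 0.5, 3)           # forward kinematics in the real setup
        distance = np.linalg.norm(hand_position - self.target)
        reward = -distance - 0.01 * np.sum(np.square(action))     # reach term + effort penalty
        done = distance < 0.05
        return self._observe(), reward, done, {}

if __name__ == "__main__":
    env = ReachEnvSketch()
    obs = env.reset()
    obs, reward, done, _ = env.step(np.zeros(env.n_joints))
    print("reward:", reward)
```

In the real setup the stubbed observations and hand position would come from the Gazebo bridge listed as the first subtask.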

Exploration

  • Roboy is already alive in virtual reality. RoboyVR provides various features that enable the user not only to observe the virtual Roboy but also to interact with and manipulate it.
    The existing features range from state visualization to nerfgun shooting, playing a xylophone and playing with multiple Roboys in VR.
    Virtual reality is fun, but it is limited by your physical space. What if you had a playground with unlimited space?

    That’s what RoboyAR is all about: build an experience similar to RoboyVR, but with the unlimited possibilities of augmented reality.
    With AR available on numerous mobile devices (incl. smartphones), almost everybody will be able to enjoy Roboy anywhere, at any time.

    Skills needed:

    • Unity 3D
    • C# knowledge
    • AR experience preferred
    • Basic network understanding
    • Creativity and surgency

    Project goals:

    • Porting RoboyVR to AR
    • Implementing enjoyable AR features to use
    • Creating fancy visuals
    • All the fun stuff you can think of
  • Ever wanted to drive a tricycle, but are too afraid of hurting yourself?
    Ever wanted to drive other people around, but do not want to be a taxi driver?
    Ever wanted to command a robot that follows your will?

    If your answer to all of these questions is yes, then TeleRickscha is the way to go!
    With the power of virtual reality and your own hands, it is now your turn to remote-control the first-ever Rickscha on campus.
    You will have power not only over the tricycle but also over the driver, the great Roboy himself!
    This experience will hopefully make not only you happy but also the passenger sitting in the back of the Rickscha, who could be your classmate.

    Skills needed:

    • Unity 3D
    • C# knowledge
    • VR experience preferred
    • Basic network understanding
    • Driving license not required (!!)

    Project goals:

    • Send control commands from VR over a network layer to the physical Rickscha. (Steering, accelerating, braking, etc.)
    • Visualize Roboy’s camera stream and turn his head to look into different directions.
    • Visualize Rickscha data like battery charge, current speed, etc.
    • Gesture control of Roboy’s arms, waving for example to get the attention of potential passengers.
    • Communicate with the passenger over the network.
    • Telegram interface: receive messages/calls from people on campus who need a ride.
  • Roboy giving you a ride on his Rickscha should be a pleasant experience for both of you.
    Being a happy passenger depends on your feeling of safety, your comfort and your current level of boredom.
    With RickschaCommander your chances of being a happy passenger are much higher, because it allows you to override
    Roboy’s behaviour (safety). If the travel speed does not meet your expectations (comfort), just change it at your fingertips!
    Going the wrong way? Just change the route via the built-in navigation system!
    Using RickschaCommander also reduces the risk of boredom on your journey, as you continuously interact with Roboy!

    Skills needed:

    • C++ knowledge
    • Basic level of graphical design
    • Basic network understanding

    Project goals:

    • UI running on a mobile device (smartphone/tablet)
    • UI visualizing battery status, speed, power consumption, etc.
    • Override switch to overrule Roboy’s autonomous driving (stop, reset, etc.)
    • Displaying and manipulating the Rickscha’s motion planning (navigation system)
    • Manual mode control of tricycle movement (accelerate, brake, steer left and right)
  • Roboy also needs to grow and move forward. Looking to the future, we want a vision for Roboy to work towards. You will design the look of the next-generation Roboy in varying sizes: a cute child-sized Roboy, an agile pre-teen Roboy and a full adult-sized Roboy. The objective is to come up with an outer look for Roboy that will contain its skeletal and mechatronic setup and still be an articulating body. The designs have to be truly articulated, meaning all the defined joints should have their desired range of motion. One of these three will be the next Roboy, and the designed object will be manufactured as a toy for our future marketing campaigns.

    Skills:

    • Advanced knowledge of CAD software and prototyping materials;
    • Clay and silicone modeling experience