
Roboy Core

The current state of Roboy’s technology

Mechatronics

Using generative design and 3D printing, mechatronics creates the bodies of our robots.


 

Middleware

FPGA-based low-level control of almost 50 motors with PID controllers running at 2500 Hz – accessible from ROS. Also, reverse engineering Vive tracking and more.

Control

Building on collaborations with different university chairs, we enable joint-space control of musculoskeletal robots. We also use spiking and classical neural networks to tame the muscles.

Cognition

A robot without a brain is just a body! Cognition makes Roboy fun, interesting and likeable. It is also where most deep neural networks live.

Growth

What would you do if you had a robot? Correct! Play with it. VR, AR, simulation & games based on and with Roboy!

Social

Building robots comes with responsibility. That’s why we have Lucy, Roboy’s best friend.

 

Mechatronics

a robot needs a body


Head

Roboy expresses his emotions through his projected face: smiling, being shy or surprised, sometimes sad – but always cute!

  • Front Shell

    • Stereolithography
    • ZED stereo camera
    • Back-projected facial features

  • Back Shell

    • Laser projector
    • 2-mirror optics system
    • Odroid XU3 (face/voice output, camera input)
    • 2-way, onboard stereo speakers built into ears

  • Neck Portion

    • 3 degrees of freedom, maintenance-free ball-in-socket joint
    • 4 MyoBrick motor muscles (100 N)
    • Trapezoidal tendon arrangement
    • On-board FPGA motor control
    • Joint sensors
  • CAD model of the head


  • Head front shell


    2-mirror optics system inside the head


    Installed head with back-projected facial features.


Torso

The chest and upper back combine to form the strong, protective rib cage that houses all the motor units, tendons and electronics for the arms.

    • Houses 18 motor units (500 N each)
    • Modular structure for easy mounting and assembly
    • Matrix Creator with 8 microphones for listening to people and lots of LEDs
    • 3D-printed, monolithic chest plate
    • Organically shaped bones
    • Bosch-profile ribs
  • CAD model of the torso


  • Assembled torso (side view).


    Assembled torso (front view).


    Torso (back view)


Hands

Roboy’s tendon-driven anthropomorphic hand closely mimics the human hand. The hand is perfect for gestures: pointing, waving, fist bumps – or even mimicking Captain Spock.

    • Human-like design
    • 20 servo motors installed on the forearm
    • Degrees of freedom (DOF):
      20 DOF for the hand
      2 DOF for the wrist 
      2 DOF for the elbow
      3 DOF for the shoulder
    • The tendon routing supports the precision of the fingers and provides stability. 
  • CAD model of the hand


  • Assembled Hand


    Hand forearm system

  • For more information, visit our GitHub repository.


Legs

Have you ever seen a humanoid robot riding a bike? Roboy’s pelvis and legs are made to ride a tricycle.

    • Lightweight design inspired by PaBiRoboy.
    • Sufficient lever arms to actuate the joints.
    • Pelvis and legs are designed for tricycling.
    • Extrusion profiles as bones allow quick adaptation of bone lengths to fit the bike.
    • Tricycling requires a 1 DOF (degree of freedom) joint at the hip, knee and ankle – overall, 6 motors to control one leg. 
    • There are 2 motors above the hip joint for flexion/extension of the thigh, 2 on the thigh for flexion/extension of the lower leg and 2 on the lower leg for flexion/extension of the foot. 
  • CAD model of the legs on the tricycle


  • Roboy on the tricycle.


    Roboy´s legs

  • For more information, visit our GitHub repositories.


Spine & Hip

The spine supports the weight of Roboy’s upper body and allows movement and flexibility, while the generatively designed hip bears the whole body weight and all the forces.

  • Spine

    • Mass about 3 kg

    • Supports the upper body’s weight; provides posture while allowing for movement and flexibility.

    • Requirements-based modeling: changing loads updates the CAD model automatically.
    • Robust design for easy adaptation.

  • Hip

    • Weight about 0.8 kg 
    • The hip joint bears Roboy’s whole body weight and all the forces
      (Roboy’s entire body weight is about 80 kg).
    • Generated with Autodesk Generative Design.
  • CAD model of the spine

    CAD model of the hip


  • Roboy’s spine.


    Roboy’s hip.


Muscles

The muscular system is responsible for the movement of Roboy’s body. Each of the muscles mimics a muscle in the human body, with all its complexity: pull-only actuation, tendon wrapping, hysteresis.

  • MyoBrick (100-300N)

    • When tension is applied to the tendon, the whole motor turns, tightening the spring.
    • Three bearings on winch and motor:
      • 2 in the front, absorbing the forces on the winch
      • 1 in the back to restrict radial movement of the motor
    • Rotation of the motor is limited by the spring that connects the motor to the housing.
    • Two different sensors:
      • Encoder in motor measures angle of motor-axis relative to the motor
      • Magnetic angle sensor measures the absolute angle of the winch

  • MyoMuscle (500N)

    • When tension is applied to the tendon, the spring gets compressed by a wire rope hoist.
    • Two bearings on winch and motor, both in the front, absorbing the forces on the winch
    • Spring deflection is limited by the nonlinearity of the wire rope hoist.
    • Two different sensors:
      • Encoder in motor measures the absolute angle of the winch
      • Linear magnetic sensor measures the deflection of the spring
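
Both muscle units are series-elastic, so the tendon force can be estimated from how far the spring is deflected. Below is a minimal sketch of that estimate using Hooke’s law; the spring constants and winch radius are placeholders, not the calibrated values from the firmware.

    def myobrick_force(motor_angle_rad, winch_angle_rad, torsion_k=2.0, winch_radius=0.006):
        """MyoBrick: the spring twists between the motor and the winch, so the
        difference between the two angle sensors gives the spring deflection.
        torsion_k (Nm/rad) and winch_radius (m) are placeholder values."""
        deflection = winch_angle_rad - motor_angle_rad
        torque = torsion_k * deflection
        return torque / winch_radius          # tendon force in N

    def myomuscle_force(spring_displacement_m, linear_k=20000.0):
        """MyoMuscle: the linear magnetic sensor measures the spring compression
        directly; force follows from Hooke's law (linear_k is a placeholder)."""
        return linear_k * spring_displacement_m
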
  • 300 N muscle:

    100 N muscle:

    M12 muscle:


  • Muscle 500 N


    Muscle 100 N

Middleware

the nervous system


FPGA Control

Each part of Roboy has its own FPGA to control the motors and gather all the sensor data. They are connected to the control network via ROS.
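
To give a concrete picture of the low-level control, here is a minimal sketch of a fixed-rate PID position loop of the kind the boards run – in Python for illustration only, with made-up gains and hypothetical read/write functions (the real loop runs in FPGA logic at 2500 Hz).

    class PID:
        """Simple PID controller evaluated at a fixed rate."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    DT = 1.0 / 2500.0                            # 2500 Hz control rate
    pid = PID(kp=80.0, ki=0.5, kd=1.2, dt=DT)    # gains are placeholders

    def control_step(read_encoder, write_pwm, setpoint_ticks):
        """One control tick: read the encoder, compute and clamp the duty cycle."""
        position = read_encoder()                 # hypothetical sensor read
        duty = pid.update(setpoint_ticks, position)
        write_pwm(max(-1.0, min(1.0, duty)))      # clamp to the allowed range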


  • For Roboy 2.0 we decided to continue using the MyoRobotics motor boards but wanted to get rid of the FlexRay and ganglions and use a DE10-Nano SoC instead, due to:

    • Availability: FlexRay and ganglion are custom-made parts ordered from Bristol University, which takes a couple of weeks from order to product, whereas the FPGA can be bought off the shelf.
    • Cost: the FPGA is cheap, roughly 20% of the price of the previous FlexRay/ganglion setup.
    • Versatility: the Intel Cyclone V SE5CSEBA6U23I7 chip combines an 800 MHz dual-core ARM Cortex-A9 with an FPGA (110,000 logic elements (LEs), 120 digital signal processing blocks (DSPs)). This combination allows very flexible system designs.

  • DE0-Nano Development Board (image reference)

  • For more information, visit our GitHub repositories.

    Further information:


Motor Board

We developed our own custom, super small but powerful motor boards. They allow us to control our motors in position, velocity and, most importantly, force mode. Since we had some space left on the PCB, we also added LEDs, IMUs and even microphones.

    • We drive our motors at 24 V.
    • They are controlled from the FPGAs; communication is handled over an SPI bus (see the sketch below).
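
To illustrate what "communication over an SPI bus" means here, the sketch below packs a setpoint into a small binary frame of the kind that would be shifted out to a motor board. The frame layout, mode codes and units are invented for illustration and are not the actual Roboy firmware protocol.

    import struct

    # Illustrative control-mode codes (not the real firmware values)
    MODE_POSITION, MODE_VELOCITY, MODE_FORCE = 0, 1, 2

    def pack_motor_command(motor_id: int, mode: int, setpoint: float) -> bytes:
        """Pack one command into a little-endian frame:
        1 byte motor id, 1 byte control mode, 4 byte float setpoint."""
        return struct.pack("<BBf", motor_id, mode, setpoint)

    def unpack_motor_status(frame: bytes) -> dict:
        """Unpack a 12-byte status frame: position, velocity, spring displacement."""
        position, velocity, displacement = struct.unpack("<fff", frame)
        return {"position": position, "velocity": velocity, "displacement": displacement}

    # Example: request 12.5 N of tendon force from motor 3 (hypothetical scaling)
    frame = pack_motor_command(motor_id=3, mode=MODE_FORCE, setpoint=12.5)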

  • Motorboard including the case


Lighthouse Tracking

We reverse-engineered the HTC Vive tracking system and developed our own sensors in order to track the absolute position of our robot in space. An order of magnitude cheaper than commercial systems and self-calibrating!

  • Lighthouse Tracking is a technology that allows sensor positions and movements to be tracked precisely in 3D space in real time. It was developed by Valve and is currently used in the HTC Vive. Two lighthouses emit light pulses that are received by sensors in order to calculate the position in space. We disassembled the Vive controllers and built our own sensors to receive the light pulses sent by the two lighthouses. Sensor measurements are processed on the FPGA board, transferred to the ARM core and made accessible via ROS.

    • Lighthouse tracking is really accurate (sub-mm).
    • It is cheap (~250 € for the lighthouses; a few € per sensor).
    • We use FPGAs to decode the signal (see the sketch below).
    • General-purpose indoor position tracking, which will
      make Roboy balance and walk.

  • Used Hardware

    • Two HTC Vive lighthouses.
    • Custom infrared sensors.
    • FPGA board with integrated ARM core.
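
The heart of the decoding is a simple timing-to-angle conversion: each rotor of a first-generation lighthouse spins at a nominal 60 Hz, so the delay between the sync flash and the moment the sweeping laser hits a photodiode maps linearly to an angle. Below is a rough sketch with simplified geometry (no base-station calibration, angles in radians); the FPGA implementation in our repositories is the authoritative version.

    import math

    ROTOR_HZ = 60.0                      # nominal rotor speed of a Vive v1 lighthouse
    ROTATION_PERIOD = 1.0 / ROTOR_HZ     # duration of one full 360° rotation

    def sweep_angle(t_sync: float, t_hit: float) -> float:
        """Angle of the sensor as seen by one rotor, from the sync-to-sweep delay."""
        return 2.0 * math.pi * (t_hit - t_sync) / ROTATION_PERIOD

    def ray_direction(h_angle: float, v_angle: float):
        """Unit ray from the lighthouse towards the sensor, given the two sweep
        angles (one per rotor). Intersecting the rays from both lighthouses
        then yields the 3D sensor position. The rotor faces the sensor half a
        rotation after the sync pulse, so pi is the centre of the field of view."""
        x = math.tan(h_angle - math.pi)
        y = math.tan(v_angle - math.pi)
        norm = math.sqrt(x * x + y * y + 1.0)
        return (x / norm, y / norm, 1.0 / norm)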

  • Lighthouse tracking sensors


    Architecture of the Lighthouse tracking. Overview of the data flow.


    DE0-Nano Development Board (image reference)


    Lighthouse optical model

    The video above shows PaBiRoboy being tracked with our custom lighthouse tracking sensors. We implemented a joint angle controller for the legs, so you can actually control each joint angle – to make them dance, for example. You can track whatever you like in 3D space. The prerequisite is to use one of our DE10-Nano SoC FPGAs; you can track up to 32 sensors with one FPGA.

  • All our development is documented on Hackaday.


Soft Skin

Skin is the largest organ of the human body. So far, Roboy has had to live without it – but not for much longer. Our silicone skin allows him to sense pressure while at the same time protecting the electronics.

    • The skin is based on planar flexible waveguides.
    • Pressure reduces the amount of light received on the other end.
    • We can reconstruct the position of the pressure.
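
A minimal sketch of how the contact position can be reconstructed: pressure attenuates the light travelling through the waveguides directly under the press, so a weighted centroid of the attenuation profile estimates the contact coordinate along one waveguide direction. Layout, baseline values and units are assumptions for illustration.

    import numpy as np

    def contact_position(received, baseline, positions):
        """Estimate where pressure was applied along one set of parallel waveguides.

        received  - light intensity currently measured at the end of each waveguide
        baseline  - intensity of each waveguide when nothing touches the skin
        positions - physical coordinate of each waveguide (e.g. in mm)
        Returns (estimated coordinate, total attenuation)."""
        received = np.asarray(received, dtype=float)
        baseline = np.asarray(baseline, dtype=float)
        positions = np.asarray(positions, dtype=float)

        attenuation = np.clip(baseline - received, 0.0, None)  # light lost under pressure
        total = attenuation.sum()
        if total < 1e-9:
            return None, 0.0                                   # no touch detected
        return float(np.dot(attenuation, positions) / total), float(total)

    # Example: 5 waveguides spaced 10 mm apart; the press is around the 20 mm mark
    coord, strength = contact_position(
        received=[1.0, 0.9, 0.4, 0.8, 1.0],
        baseline=[1.0, 1.0, 1.0, 1.0, 1.0],
        positions=[0, 10, 20, 30, 40],
    )
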
  • For more information, visit our GitHub repository.

Control

making roboy move


CASPR / CASPROS

CASPR is an open-source platform designed for research on CDPRs (cable-driven parallel robots), written in MATLAB. Combine it with ROS and you get CASPROS – the right tool to control our robots.

  • CASPR is an open-source simulation tool written in MATLAB. It was developed for research purposes in the area of tendon-driven robots, so-called CDPRs (cable-driven parallel robots). The simulation software can perform analyses in the following fields of study:

    • Dynamics and Control
    • Forward Dynamics
    • Inverse Dynamics
    • Motion Control
    • Kinematics (Forward Kinematics, Inverse Kinematics)
    • Workspace Analysis
    • Design Optimisation

    CASPR provides a GUI, which allows easy and intuitive access to the main functions. In general, one starts by choosing a robot model. These models are simplifications of the actual robots one is interested in. Robot models in CASPR are built from three different primitives:

    • Links
    • Joints
    • Cables

    where links and cables are straight lines with predefined start and end positions in space, and joints provide the connection between the links. 

    The main difference between CASPR and CASPROS is that CASPROS is written entirely in C++ and is accessible via ROS; it does not provide a GUI, and it outputs motor commands rather than cable lengths in order to control real robots. CASPROS is therefore the link we need to control Roboy, while CASPR offers us a tool for simulation and research purposes, e.g. validating our robot models.
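
The geometric core of any CDPR model is that each cable length is the distance between its attachment point on the fixed frame and its attachment point on the moving link, given the link's pose. The NumPy sketch below shows that inverse-kinematics step with made-up attachment points; CASPR itself performs this in MATLAB from the .xml model files.

    import numpy as np

    def cable_lengths(frame_points, link_points, position, rotation):
        """Inverse kinematics of a single cable-driven link.

        frame_points - (N, 3) cable attachment points on the fixed frame (world frame)
        link_points  - (N, 3) attachment points on the moving link (link frame)
        position     - (3,)  position of the link origin in the world frame
        rotation     - (3, 3) rotation of the link frame w.r.t. the world frame
        Returns the N straight-line cable lengths."""
        frame_points = np.asarray(frame_points, dtype=float)
        link_points = np.asarray(link_points, dtype=float)
        rotation = np.asarray(rotation, dtype=float)
        world_link_points = np.asarray(position, dtype=float) + link_points @ rotation.T
        return np.linalg.norm(frame_points - world_link_points, axis=1)

    # Example: two cables holding one link (all attachment points are made up)
    lengths = cable_lengths(
        frame_points=[[0.0, 0.2, 0.5], [0.0, -0.2, 0.5]],
        link_points=[[0.1, 0.05, 0.0], [0.1, -0.05, 0.0]],
        position=[0.3, 0.0, 0.2],
        rotation=np.eye(3),
    )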


  • CASPR GUI main window: CASPR models are always built from three .xml files – one file for the body description (links, joints, centers of mass), one for the cables (attachment points) and another describing a joint trajectory. You can see a primitive robot model, a simple arm with just one joint, visualized in the 3D coordinate system. The black lines are the links, the black circles the joints, the blue circles the centers of mass, and the red lines are the cables/tendons. 

  • For more information, visit our GitHub repository

    and have a look at our documentation.


Hill Muscle Control

In order to make our robots behave as closely as possible to the human body, we apply models that make our motor units behave like Hill-type muscles. We use EMG measurements to find the right parameters for the model.
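
To make the idea concrete, a Hill-type model computes the tendon force from activation, a force-length curve, a force-velocity curve and a passive elastic term. The sketch below uses standard textbook curve shapes and placeholder parameters, not the EMG-fitted model we actually use.

    import math

    def hill_muscle_force(activation, length, velocity, f_max=100.0, v_max=10.0):
        """Hill-type muscle force (textbook approximation).

        activation - neural activation in [0, 1]
        length     - fiber length normalized to the optimal length (1.0 = optimal)
        velocity   - normalized contraction velocity (negative = shortening)
        f_max      - maximum isometric force in N (placeholder value)"""
        # Active force-length relation: bell curve around the optimal length
        f_l = math.exp(-((length - 1.0) ** 2) / 0.1)
        # Force-velocity relation: hyperbolic Hill curve when shortening,
        # capped gain when lengthening
        if velocity <= 0:
            f_v = (1 + velocity / v_max) / (1 - 4 * velocity / v_max)
        else:
            f_v = min(1.5, 1 + velocity / v_max)
        # Passive elastic force beyond the slack length
        f_p = math.exp(5.0 * (length - 1.2)) - 1.0 if length > 1.2 else 0.0
        return f_max * (activation * f_l * max(f_v, 0.0) + f_p)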

    • You can directly control every motor. The motors have three control modes: position, velocity and force. Position and velocity refer to the motor shaft, while force mode is achieved via displacement sensors on the muscle’s spring. 
    • Each FPGA controls up to 14 motors. Communication with the FPGA works via ROS messages/services. We have written rqt plugins for convenient control/visualization.
  • You can see the rqt plugins in action in the video above.


    Muscle unit with motor board

  • For more information, visit our GitHub repository.

Cognition

making roboy smart


Dialog System

To say or not to say, and what to say – that is the question! A 3-layer approach to smart answers with rules, memory and neural networks.


    • Natural language processing
    • Deep learning for unknown situations
    • Flexible state machine with different personalities
    • Ability to remember facts
    • Awareness of environment
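
As a toy illustration of the "flexible state machine" idea: each state decides what Roboy says, and the detected intent selects the next state (a personality could bias these transitions). State names, phrases and transition logic below are invented; the real Dialog System lives in the linked repository.

    # Toy dialog state machine (all names and phrases are made up)
    STATES = {
        "greeting":  {"say": "Hi, I am Roboy! What is your name?",
                      "next": {"name_given": "ask_hobby"}},
        "ask_hobby": {"say": "Nice to meet you! What do you like to do?",
                      "next": {"hobby_given": "farewell"}},
        "farewell":  {"say": "That sounds fun. See you around!",
                      "next": {}},
    }

    def run_turn(state_name: str, intent: str):
        """Advance the conversation by one turn given the detected intent;
        unknown intents keep the conversation in the current state."""
        next_name = STATES[state_name]["next"].get(intent, state_name)
        return next_name, STATES[next_name]["say"]

    state, reply = run_turn("greeting", "name_given")   # -> "ask_hobby" and its phrase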

  • The overview diagram shows the external systems with which the Dialog System interacts and the tasks for which the system is responsible.

  • For more information, visit our GitHub repository

    and have a look at our documentation.


Semantic Parser

Roboy Parser is responsible for constructing a formal representation of user input, spoken language in particular. This representation is then used by the logic and inference modules of the dialog system, which in turn generate an action in response, such as an emotion, a body movement or an utterance.

    • Based on the SEMPRE library.
    • Constructs a logical form for further inference.
    • Represents the input sentences as semantic triples of Subject, Object and Predicate.
    • Includes sentiment-analysis functionality.
    • Parses the data towards the Neo4j ontology and sentence semantics.
    • Uses DBpedia and the Microsoft Knowledge Graph for entity retrieval.
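
To show what "a semantic triple of Subject, Object and Predicate" looks like in practice, here is a toy representation with a single hand-written pattern rule; the real parser constructs these logical forms with SEMPRE grammars rather than string matching.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Triple:
        subject: str
        predicate: str
        obj: str

    def naive_parse(sentence: str) -> Optional[Triple]:
        """Toy rule: 'X is from Y' -> (X, FROM, Y). Illustration only."""
        words = sentence.rstrip(".!?").split()
        if "is" in words and "from" in words:
            i, j = words.index("is"), words.index("from")
            return Triple(subject=" ".join(words[:i]),
                          predicate="FROM",
                          obj=" ".join(words[j + 1:]))
        return None

    triple = naive_parse("Roboy is from Munich")
    # Triple(subject='Roboy', predicate='FROM', obj='Munich')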

  • Semantic parser components

  • For more information, visit our GitHub repository

    and have a look at our documentation.


Memory

Roboy Memory consists of a Neo4j graph database and a Java client to serve it. Roboy stores each entity (a person or an organization, for instance) as a node in the graph and builds up connections between nodes as he collects more information from conversations with people.

    • Knowledge Graph for efficient representations.
    • Neo4j graph database allows fast operations on graphs.
    • Java client provides the DB driver to manipulate the data.
    • Custom JSON based protocol for easy DB queries.
    • Used to store the learned information about the environment.
    • Memory is updated during every conversation with a person.
    • Currently consists of nearly 500 data points.
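
A minimal sketch of the kind of graph update the memory performs, shown with the official Neo4j Python driver and a hand-written Cypher query; Roboy’s actual client is the Java library with its JSON-based protocol, and the node labels and properties below are assumptions.

    from neo4j import GraphDatabase

    # Connection details are placeholders
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    def remember_person(name: str, hobby: str):
        """Create (or reuse) a person node and link it to a hobby node."""
        query = (
            "MERGE (p:Person {name: $name}) "
            "MERGE (h:Hobby {name: $hobby}) "
            "MERGE (p)-[:HAS_HOBBY]->(h)"
        )
        with driver.session() as session:
            session.run(query, name=name, hobby=hobby)

    remember_person("Alice", "robotics")   # something learned during a conversation
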
  • For more information, visit our GitHub repository

    and have a look at our documentation.


Audio

Auditory data plays a crucial role in human interaction with the environment. Therefore, we are developing an audio subsystem for Roboy that enables him to understand and produce human speech, as well as to process other noises coming from the surroundings.

    • Uses the Bing Speech API for speech recognition
    • Uses the Cerevoice SDK for speech synthesis
    • Uses the Matrix Creator to localize the sound source (see the sketch below)
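
As a rough illustration of sound-source localization, the sketch below estimates the direction of arrival from two microphone signals via the time difference of arrival found by cross-correlation. Sample rate, microphone spacing and the far-field assumption are placeholders; the real pipeline uses all eight microphones of the Matrix Creator.

    import numpy as np

    SPEED_OF_SOUND = 343.0   # m/s at room temperature
    SAMPLE_RATE = 16000      # Hz (assumed)
    MIC_DISTANCE = 0.08      # m between the two microphones (assumed)

    def direction_of_arrival(mic_a: np.ndarray, mic_b: np.ndarray) -> float:
        """Angle of the sound source in radians (0 = broadside to the mic pair)."""
        # Cross-correlate the two signals to find their relative delay in samples
        corr = np.correlate(mic_a, mic_b, mode="full")
        lag = int(np.argmax(corr)) - (len(mic_b) - 1)
        tdoa = lag / SAMPLE_RATE
        # Far-field assumption: sin(theta) = c * tdoa / d
        sin_theta = np.clip(SPEED_OF_SOUND * tdoa / MIC_DISTANCE, -1.0, 1.0)
        return float(np.arcsin(sin_theta))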

  • Matrix Creator: Multifunctional sensor board

  • For more information, visit our GitHub repository

    and have a look at our documentation.

Vision

The goal of the Roboy Vision project is to provide Roboy with extensive vision capabilities. This means recognizing, localizing and classifying objects in the environment, as well as providing data for localization to be processed by other modules.

Growth

playing with roboy 


VRoboy

RoboyVR is a virtual reality experience in which the user can watch, but also interact with, Roboy while he performs specific tasks.

    • Digital twin in a virtual environment
    • Interaction through HTC Vive
    • Powered by Unity
    • Gazebo simulation
    • ROS communication
    • Playful experience

  • Roboy in a virtual environment.


    Pointing Device to interact with Roboy and the UI (user interface).


    Hand tool to move Roboy around.

  • For more information visit our documentation.


Games

Take Roboy and build a game around him. Use the actual Roboy, his features or any aspect of his story and out comes… fun & magic.


    • Roboy shows gestures and is able to recognize and react during the game.
    • Games based on features like text-to-speech, face display and movement.
  • A 2D puzzle game in which the player controls a small ship, piloted by Roboy, which flies around levels that represent petri dishes containing cell cultures. In these levels the player interacts with the environment as well as with friendly, neutral and enemy cells and robotic enemies.



    Picture of the jump’n run puzzle game.