ROBOY CORE

The almost-current state of Roboy’s core technology

Roboy’s tech stack is fast-evolving. This page reflects snapshots of mostly complete states to give you an idea of where we were a few months back – so you know what we are working on. If you’re interested in working with us, please reach out to collaborate@roboy.org and we’ll happily give you access to our internal documentation, the CAD repository, etc. For code, we’re live on GitHub – just stop by.

If you need fast support, stop by on Telegram: https://t.me/roboytech

MECHATRONICS

Using generative design and 3D printing, Mechatronics creates the bodies of our robots.

Front Shell

  • Stereolithography
  • ZED stereo camera
  • Back-projected facial features

Back Shell

  • Laser projector
  • 2-mirror optics system
  • Odroid XU3 (face/voice output, camera input)
  • 2-way, onboard stereo speakers built into ears

Neck Portion

  • 3 degrees of freedom, maintenance-free ball-in-socket joint
  • 4 MyoBrick motor muscles (100 N)
  • Trapezoidal tendon arrangement
  • On-board FPGA motor control
  • Joint sensors
Head front shell.
2-mirror optics system inside the head.
Installed head with back-projected facial features.
Torso

  • Houses 18 motor units (500 N each)
  • Modular structure for easy mounting and assembly
  • Matrix Creator with 8 microphones for listening to people, plus an array of LEDs
  • 3D-printed, monolithic chest plate
  • Organically shaped bones
  • Bosch-profile ribs
Assembled torso (side view).
Assembled torso (front view).
Torso (back view).
Arms and Hands

  • Human-like design
  • 20 servo motors installed in the forearm
  • Degrees of freedom (DOF):
    20 DOF for the hand
    2 DOF for the wrist
    2 DOF for the elbow
    3 DOF for the shoulder
  • The tendon routing supports the precision of the fingers and provides stability.
Assembled Hand
Hand forearm system

For more information visit our repository.  

Legs

  • Lightweight design inspired by PaBiRoboy.
  • Sufficient lever arms to actuate the joints.
  • The pelvis and legs are designed to work for tricycling.
  • Extrusion-profile bones allow quick adaptation of the bone lengths to fit onto the bike.
  • Tricycling requires a 1 DOF (degree of freedom) joint at the hip, knee, and ankle; overall, 6 motors control one leg.
  • There are 2 motors above the hip joint for flexion/extension of the thigh, 2 on each thigh for flexion/extension of the lower leg, and 2 on the lower leg for flexion/extension of the foot.
Roboy on the tricycle.
Roboy’s legs.

Spine

  • Mass about 3 kg
  • Supports the upper body’s weight; provides posture while allowing for movement and flexibility.
  • Requirements-based modeling: changing the loads updates the CAD model automatically.
  • Robust design for easy adaptation.

Hip

  • Weight: about 0.8 kg
  • The hip joint bears Roboy’s entire body weight and all resulting forces
    (Roboy’s body weight is about 80 kg).
  • Generated with Autodesk Generative Design.
Roboy’s spine.
Roboy’s hip.

MyoBrick (100-300N)

  • When tension is applied to the tendon, the whole motor turns, tightening the spring.
  • Three bearings on winch and motor:
    • 2 in the front, absorbing the forces on the winch
    • 1 in the back to restrict radial movement of the motor
  • Rotation of the motor is limited by the spring that connects the motor to the housing.
  • Two different sensors:
    • An encoder in the motor measures the angle of the motor axis relative to the motor body
    • A magnetic angle sensor measures the absolute angle of the winch

MyoMuscle (500N)

  • When tension is applied to the tendon, the spring gets compressed by a wire rope hoist.
  • Two bearings on winch and motor, both in the front, absorbing the forces on the winch
  • Spring deflection is limited by the nonlinearity of the wire rope hoist.
  • Two different sensors:
    • An encoder in the motor measures the absolute angle of the winch
    • A linear magnetic sensor measures the deflection of the spring
Muscle 500 N
Muscle 100 N
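Both muscle designs are series-elastic: the tendon force can be estimated from how far the spring is deflected. Below is a minimal Python sketch of that idea; the spring constant, winch radius, and the simplified geometry are illustrative assumptions, not values from the actual firmware.

# Illustrative series-elastic force estimation (assumed constants, not firmware values).

SPRING_CONSTANT_N_PER_MM = 20.0   # assumed linear spring stiffness
WINCH_RADIUS_MM = 5.0             # assumed winch radius

def myobrick_force(motor_angle_rad, winch_angle_rad):
    """MyoBrick: the spring winds up by the difference between the motor
    encoder angle and the absolute winch angle (gear ratio ignored here)."""
    spring_windup_rad = motor_angle_rad - winch_angle_rad
    deflection_mm = spring_windup_rad * WINCH_RADIUS_MM
    return SPRING_CONSTANT_N_PER_MM * deflection_mm

def myomuscle_force(spring_deflection_mm):
    """MyoMuscle: the linear magnetic sensor reads the spring deflection directly."""
    return SPRING_CONSTANT_N_PER_MM * spring_deflection_mm

if __name__ == "__main__":
    print(myobrick_force(1.2, 0.9))   # ~30 N for 0.3 rad of windup
    print(myomuscle_force(4.0))       # ~80 N for 4 mm of deflection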

MIDDLEWARE

FPGA-based low-level control of almost 50 motors, with PID loops running at 2500 Hz – accessible from ROS. Also, reverse-engineering Vive tracking and more.
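As a rough illustration of what one of those loops does (the real controllers run in FPGA logic; the gains, the toy plant model, and the pure-Python loop below are placeholders for illustration only):

import time

# Placeholder PID gains and loop rate; the real values live in the FPGA design.
KP, KI, KD = 80.0, 5.0, 0.5
LOOP_HZ = 2500.0
DT = 1.0 / LOOP_HZ

def pid_step(setpoint, measurement, state):
    """One PID update; `state` carries the integral and previous error."""
    error = setpoint - measurement
    state["integral"] += error * DT
    derivative = (error - state["previous_error"]) / DT
    state["previous_error"] = error
    return KP * error + KI * state["integral"] + KD * derivative

state = {"integral": 0.0, "previous_error": 0.0}
measurement = 0.0
for _ in range(10):                      # a few iterations of the 2.5 kHz loop
    command = pid_step(1.0, measurement, state)
    measurement += 0.0005 * command      # toy plant model standing in for the motor
    time.sleep(DT)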

For Roboy 2.0 we decided to continue using the MyoRobotics motor boards, but wanted to get rid of the FlexRay and ganglions and use a DE10-Nano SoC instead, due to:

  • availability: the FlexRay and ganglion boards are custom-made parts ordered from Bristol University, which takes a couple of weeks from order to product, whereas the FPGA can be bought off the shelf
  • cost: the FPGA is cheap, roughly 20% of the price of the previous FlexRay/ganglion setup
  • versatility: the Intel Cyclone V SE 5CSEBA6U23I7 chip combines an 800 MHz dual-core ARM Cortex-A9 with an FPGA (110,000 logic elements (LEs), 120 digital signal processing (DSP) blocks). This combination allows for very flexible system designs.
DE0-Nano Development Board (image reference)
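On the Cyclone V SoC, software on the ARM core typically reaches custom FPGA logic through the lightweight HPS-to-FPGA bridge, which is memory-mapped at physical address 0xFF200000. The following sketch shows how a register could be read from Linux on the ARM side; the register offset and its meaning are made-up examples, not addresses from our firmware.

import mmap
import os
import struct

LWH2F_BASE = 0xFF200000      # lightweight HPS-to-FPGA bridge on the Cyclone V SoC
SPAN = 0x1000                # map one page of the bridge address space
EXAMPLE_OFFSET = 0x0         # hypothetical register offset of a motor position readout

# Requires root, since /dev/mem exposes physical memory.
fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
mem = mmap.mmap(fd, SPAN, mmap.MAP_SHARED,
                mmap.PROT_READ | mmap.PROT_WRITE, offset=LWH2F_BASE)

raw = mem[EXAMPLE_OFFSET:EXAMPLE_OFFSET + 4]   # read 32 bits from the FPGA fabric
value = struct.unpack("<i", raw)[0]
print("register value:", value)

mem.close()
os.close(fd)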

For more information visit our repositories.

Lighthouse tracking is a technology that allows precise tracking of sensor positions and movements in 3D space in real time. It was developed by Valve and is currently used in the HTC Vive. Two lighthouses emit light pulses that are received by sensors in order to calculate their position in space. We disassembled the Vive controllers and built our own sensors to receive the light pulses sent by the two lighthouses. The sensor measurements are processed on the FPGA board, transferred to the ARM core, and made accessible via ROS.

  • Lighthouse tracking is very accurate (sub-mm).
  • It is cheap (~250 € for the lighthouses; a few € per sensor).
  • We use FPGAs to decode the signal.
  • It provides general-purpose indoor position tracking, which will help Roboy balance and walk.
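The decoding itself is conceptually simple: each lighthouse rotor spins at 60 Hz, so the time between a sync flash and the moment the laser sweep hits a sensor maps linearly to a sweep angle; two sweep angles define a ray from the lighthouse to the sensor, and rays from both lighthouses are intersected to obtain the 3D position. A rough sketch of the timing-to-angle step (the 50 MHz timestamp clock is an assumption about the capture logic):

import math

FPGA_CLOCK_HZ = 50e6          # assumed timestamp clock of the FPGA capture logic
ROTOR_HZ = 60.0               # each lighthouse rotor sweeps the room 60 times per second
SWEEP_PERIOD_S = 1.0 / ROTOR_HZ

def sweep_angle(sync_tick, hit_tick):
    """Convert the delay between sync pulse and laser hit into a sweep angle (radians)."""
    delay_s = (hit_tick - sync_tick) / FPGA_CLOCK_HZ
    return 2.0 * math.pi * delay_s / SWEEP_PERIOD_S

# Example: a hit 4.166 ms after the sync pulse corresponds to a quarter turn.
angle = sweep_angle(sync_tick=0, hit_tick=int(0.0041666 * FPGA_CLOCK_HZ))
print(math.degrees(angle))    # ~90 degrees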

Used Hardware

  • Two HTC Vive lighthouses.
  • Custom infrared sensors.
  • FPGA board with integrated ARM core.
Lighthouse tracking sensors
Architecture of the Lighthouse tracking. Overview of the data flow.
DE0-Nano Development Board (image reference)
Lighthouse optical model
The video above shows PaBiRoboy being tracked with our custom lighthouse tracking sensors. We implemented a joint-angle controller for the legs, so you can control each joint angle individually, to make the legs dance for example. You can track whatever you like in 3D space; the only prerequisite is one of our DE10-Nano SoC FPGAs. One FPGA can track up to 32 sensors.

All our development is documented on Hackaday.

Skin

  • The skin is based on planar flexible waveguides.
  • Pressure reduces the amount of light received at the other end of a waveguide.
  • From the pattern of attenuation we can reconstruct the position of the pressure.
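As a toy illustration of the reconstruction step, assuming the waveguides are arranged as crossing rows and columns (an assumption about the layout, not a specification of the actual skin):

import numpy as np

def locate_pressure(row_intensity, col_intensity, baseline=1.0):
    """Return the (row, column) cell whose crossing waveguides lost the most light.

    row_intensity / col_intensity: received light per horizontal / vertical waveguide,
    normalized so that an untouched waveguide reads `baseline`.
    """
    row_loss = baseline - np.asarray(row_intensity)
    col_loss = baseline - np.asarray(col_intensity)
    return int(np.argmax(row_loss)), int(np.argmax(col_loss))

# Pressing near row 2 / column 1 dims those two waveguides the most.
rows = [1.0, 0.98, 0.60, 0.97]
cols = [0.99, 0.55, 1.0]
print(locate_pressure(rows, cols))   # (2, 1)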

For more information visit our repository.

Control

Building on collaborations with different university chairs, enabling joint-space control of musculoskeletal robots. Also, spiking and classical neural networks to tame the muscles.

CASPR is an open-source simulation tool written in MATLAB. It is developed for research on cable-driven parallel robots (CDPRs), i.e. tendon-driven robots. The simulation software can perform analyses in the following fields of study:

  • Dynamics and Control
  • Forward Dynamics
  • Inverse Dynamics
  • Motion Control
  • Kinematics (Forward Kinematics, Inverse Kinematics)
  • Workspace Analysis
  • Design Optimisation

CASPR provides a GUI which allows easy and intuitive access to the main functions. In general, one starts by choosing a robot model. These models are simplifications of the actual robots one is interested in. Robot models in CASPR are built from three different primitives:

  • Links
  • Joints
  • Cables

where links and cables are straight lines with predefined start and end positions in space, and joints provide the connections between the links.

The main differences between CASPR and CASPROS are that CASPROS is written entirely in C++ and accessible via ROS; it does not provide a GUI and outputs motor commands rather than cable lengths, so it can control real robots. CASPROS is therefore the link we need to control Roboy, while CASPR offers us a tool for simulation and research purposes, e.g. validating our robot models.

CASPR GUI main window: CASPR models are always built from three .xml files – one for the body description (links, joints, centers of mass), one for the cables (attachment points), and one describing a joint trajectory. The screenshot shows a primitive robot model, a simple arm with just one joint, visualized in the 3D coordinate system. The black lines are the links, the black circles the joints, the blue circles the centers of mass, and the red lines the cables/tendons.
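Purely to illustrate the three primitives (this is neither CASPR's actual .xml schema nor its API), a single-joint arm with an antagonistic cable pair could be sketched like this in Python:

from dataclasses import dataclass
from typing import Tuple

Point = Tuple[float, float, float]

@dataclass
class Link:
    name: str
    start: Point          # straight line between two points in space
    end: Point

@dataclass
class Joint:
    name: str
    parent: str           # links connected by this joint
    child: str
    dof: int

@dataclass
class Cable:
    name: str
    attach_from: Point    # attachment point on the parent side
    attach_to: Point      # attachment point on the moving link

# A primitive one-joint arm, roughly like the example shown in the CASPR GUI.
base = Link("base", (0, 0, 0), (0, 0, 0.3))
arm = Link("arm", (0, 0, 0.3), (0, 0.3, 0.3))
elbow = Joint("elbow", parent="base", child="arm", dof=1)
cables = [
    Cable("flexor", (0, 0.05, 0.0), (0, 0.05, 0.35)),
    Cable("extensor", (0, -0.05, 0.0), (0, -0.05, 0.35)),
]
print(elbow, [c.name for c in cables])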

For more information visit our repository and have a look at our documentation.

  • You can directly control every motor. The motors have three control modes: position, velocity, and force. Position and velocity refer to the motor itself, while force control is achieved via the displacement sensors on the muscle’s spring.
  • Each FPGA controls up to 14 motors. The communication with the FPGA works via ROS messages/services. We have written rqt plugins for convenient control/visualization.

You can see the rqt plugins in action in the video above.
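A minimal sketch of what streaming a setpoint from ROS could look like in Python. The topic name and the plain Float32 message below are placeholders; the real message and service definitions live in our repositories, so check there for the actual interfaces.

#!/usr/bin/env python
# Hypothetical example: the real topic names and message types are defined in our
# ROS packages; here a plain Float32 setpoint stands in for a motor command.
import rospy
from std_msgs.msg import Float32

rospy.init_node("motor_command_example")
pub = rospy.Publisher("/example/motor0/setpoint", Float32, queue_size=1)

rate = rospy.Rate(100)          # command stream at 100 Hz; the FPGA PID runs much faster
setpoint = 0.0
while not rospy.is_shutdown():
    pub.publish(Float32(data=setpoint))
    setpoint += 0.01            # slowly ramp the position/velocity/force setpoint
    rate.sleep()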

Muscle unit with motor board

For more information visit our repository

COGNITION

A robot without a brain is just a body! Cognition makes Roboy fun, interesting, and likeable. It is also where most of the deep neural networks live.

  • Natural language processing
  • Deep learning for unknown situations
  • Flexible state machine with different personalities
  • Ability to remember facts
  • Awareness of environment
The overview diagram shows the external systems the Dialog System interacts with and the tasks for which it is responsible.
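To give an idea of what a flexible state machine with different personalities means in practice, here is a heavily simplified sketch; the states, events, and personality traits are invented for illustration and do not mirror the actual Dialog System.

# Toy dialog state machine; states and personality traits are illustrative only.
PERSONALITIES = {
    "cheerful": {"greeting": "Hey, great to see you!", "farewell": "Bye, that was fun!"},
    "grumpy":   {"greeting": "Oh. It's you.",          "farewell": "Finally, some peace."},
}

TRANSITIONS = {
    ("idle", "person_detected"): "greeting",
    ("greeting", "reply_received"): "smalltalk",
    ("smalltalk", "goodbye_heard"): "farewell",
    ("farewell", "done"): "idle",
}

def step(state, event, personality):
    """Advance the state machine and return (new_state, utterance or None)."""
    new_state = TRANSITIONS.get((state, event), state)
    utterance = PERSONALITIES[personality].get(new_state)   # speak on greeting/farewell
    return new_state, utterance

state = "idle"
for event in ["person_detected", "reply_received", "goodbye_heard", "done"]:
    state, utterance = step(state, event, "cheerful")
    if utterance:
        print(utterance)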

For more information visit our repository and have a look at our documentation.

  • Based on the SEMPRE library.
  • Constructs a logical form for further inference.
  • Represents input sentences as semantic triples of Subject, Predicate, and Object.
  • Includes sentiment analysis functionality.
  • Maps the parsed data onto the Neo4j ontology and the sentence semantics.
  • Uses DBpedia and the Microsoft Knowledge Graph for entity retrieval.
Semantic parser components
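As a small illustration of the triple representation (not the parser's actual data structures), the sentence "Roboy was built in Munich" reduces to something like:

from dataclasses import dataclass

@dataclass
class Triple:
    subject: str
    predicate: str
    obj: str

# "Roboy was built in Munich" as a subject-predicate-object triple,
# ready to be matched against entities in the knowledge graph.
triple = Triple(subject="Roboy", predicate="built_in", obj="Munich")
print(triple)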

For more information visit our repository and have a look at our documentation.

  • Knowledge Graph for efficient representations.
  • Neo4j graph database allows fast operations on graphs.
  • A Java client provides the DB driver to manipulate the data.
  • A custom JSON-based protocol allows easy DB queries.
  • Used to store the learned information about the environment.
  • Memory is updated during every conversation with a person.
  • Currently consists of nearly 500 data points.
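The JSON protocol is custom, so the exact fields are defined in the memory repository; purely as a hypothetical illustration, a create/query exchange could look like this:

import json

# Hypothetical message shapes; the real field names are defined in the memory module.
create_node = {
    "operation": "create",
    "label": "Person",
    "properties": {"name": "Laura", "occupation": "student"},
}
get_node = {
    "operation": "get",
    "label": "Person",
    "match": {"name": "Laura"},
}

# The client would serialize such requests and send them to the memory service.
print(json.dumps(create_node))
print(json.dumps(get_node))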

For more information visit our repository and have a look at our documentation.

  • Using the Bing Speech API for speech recognition
  • Using the CereVoice SDK for speech synthesis
  • Using the Matrix Creator to localize the sound source
Matrix Creator: Multifunctional sensor board

For more information visit our repository and have a look at our documentation.

  • Object recognition using YOLO
  • Speaker detection using facial landmarks from DLIB
  • Face embeddings using FaceNet
  • Multitracking
  • ROS interface
Architecture of the current Vision System.
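A stripped-down sketch of the face-detection part of such a pipeline, using dlib's frontal face detector and publishing the number of detected faces over ROS; the topic name and loop rate are placeholders:

#!/usr/bin/env python
import cv2
import dlib
import rospy
from std_msgs.msg import Int32

rospy.init_node("face_counter_example")
pub = rospy.Publisher("/example/vision/face_count", Int32, queue_size=1)

detector = dlib.get_frontal_face_detector()   # HOG-based frontal face detector
camera = cv2.VideoCapture(0)

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    ok, frame = camera.read()
    if not ok:
        continue
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)                     # list of face bounding rectangles
    pub.publish(Int32(data=len(faces)))
    rate.sleep()

camera.release()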

For more information visit our repository and have a look at our documentation.

GROWTH

What would you do if you had a robot? Correct: play with it! VR, AR, simulation & games based on and with Roboy!

  • Digital twin in a virtual environment
  • Interaction through HTC Vive
  • Powered by Unity
  • Gazebo simulation
  • ROS communication
  • Playful experience
Roboy in a virtual environment.
Pointing Device to interact with Roboy and the UI (user interface).
Hand tool to move Roboy around.
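Unity does not speak native ROS, so a common pattern for this kind of setup (shown here only as a sketch of one possible approach, not necessarily the exact one we use) is to bridge over a rosbridge websocket. On the Python/ROS side, subscribing to the simulated joint states with roslibpy could look like this; host, port, and topic are placeholders:

import roslibpy

# Connect to a rosbridge_server websocket (placeholder host/port).
client = roslibpy.Ros(host='localhost', port=9090)

# Placeholder topic: Gazebo usually publishes joint states on /joint_states.
joint_states = roslibpy.Topic(client, '/joint_states', 'sensor_msgs/JointState')

def on_joint_state(message):
    # Each message carries joint names and positions that the Unity twin can mirror.
    print(list(zip(message['name'], message['position'])))

joint_states.subscribe(on_joint_state)

try:
    client.run_forever()      # keep the bridge connection alive
except KeyboardInterrupt:
    joint_states.unsubscribe()
    client.terminate()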

For more information visit our documentation.

  • Roboy shows gestures and is able to recognize and react during the game.
  • Games are based on features like text-to-speech, face display, and movement.
2D puzzle game in which the player controls a small ship, piloted by Roboy, that flies around levels representing petri dishes containing cell cultures. In these levels the player interacts with the environment as well as friendly, neutral, and enemy cells and robotic enemies.
Picture of the jump’n’run puzzle game.