Hardware Services

The Flobi Robot

Flobi is an anthropomorphic robot head that combines a modular design approach with a sensor head. The robot uses a comic-like human face in order to avoid the unwanted reactions predicted by the uncanny valley hypothesis. Further requirements, such as a mask without any holes, led to the development of an innovative magnetic actuation mechanism. The hair part, eyebrows, lips and frontal face part are easily replaceable in order to experiment with different character features. The platform features stereo vision, stereo audio and a gyroscope for motion compensation. The cameras are actuated and support saccade speeds of at least 400°/s. Velocity control is provided by custom-designed motor control boards that implement the necessary control algorithms with real-time support. The complete robot head has 18 degrees of freedom.

Flobi Robot

Contact

General (Head of CLF)
  • PD Dr.-Ing. Sven Wachsmuth
  • Room: 0.408
  • Phone: +49 521-106-2937
  • Mail: swachsmuATtechfak.uni-bielefeld.de
Hardware Requests
  • Simon Schulz
  • Room: 0.411
  • Mail: sschulzATtechfak.uni-bielefeld.de
Software Requests
  • Florian Lier & Simon Schulz
  • Room: 0.409
  • Mail: flierATtechfak.uni-bielefeld.de

Getting Started

In order to experiment with the Flobi robot (in simulation or on the real robot), you can use our CITK deployment tool chain. For now, we only fully support Ubuntu 16.04 [Xenial].

Note

You will need CREDENTIALS in order to check out the source code. Please contact Florian Lier or Simon Schulz.

Please follow the instructions in the given order:

  1. https://toolkit.cit-ec.uni-bielefeld.de/tutorials/bootstrapping
  2. https://toolkit.cit-ec.uni-bielefeld.de/tutorials/installing

In Step 2, please substitute DESIRED_DISTRIBUTION.distribution with flobi-minimal-xenial-0.2.distribution. Also, before actually installing (running the buildflow job), have a look at the job-configurator output, which will indicate missing OS packages. In case you experience problems, please first have a detailed look at the CITK documentation, then ask Florian or Simon directly.

Start The Flobi System

  1. Configure your SSH environment for password-less access (leave the passphrase blank by pressing Enter): https://help.ubuntu.com/community/SSH/OpenSSH/Keys

  2. Switch to the install directory and start the system. By default:

    cd ~/citk/systems/flobi-minimal-xenial-0.2/
    cd bin
    ./vdemo_flobi-minimal.sh
    
  3. You should see VDEMO coming up.

VDEMO
  1. Press “ALL COMPONENTS start”
  2. Enjoy!

Simple API Example (PYTHON)

Create a file flobi_api_example.py with the following content:

# Author: flier@techfak #

# STD
import time
import random
import logging

# FLOBI
from hlrc_client import *


class FlobiAPIExample:

    def __init__(self, _middleware, _scope):
        self.mw = _middleware.strip()
        self.scope = _scope.strip()
        self.robot_controller = RobotController(self.mw, self.scope, logging.INFO)
        # DEBUG
        # self.robot_controller = RobotController(self.mw, self.scope, logging.DEBUG)

    def say_something(self, _input):
        """
            Request the robot to say something using TTS
            :param _input: text to synthesize
        """
        # Blocking call
        self.robot_controller.set_speak(str(_input), True)

    def set_emotion(self, _input, _time):
        """
            Set the current emotion
            :param _input: name of the emotion to set (e.g. "angry")
            :param _time: duration of the emotion in milliseconds
        """
        re = RobotEmotion()
        # Time in milliseconds
        re.time_ms = int(_time)
        # Available Emotions:
        # RobotEmotion.ANGRY,
        # RobotEmotion.NEUTRAL,
        # RobotEmotion.HAPPY,
        # RobotEmotion.SAD,
        # RobotEmotion.SURPRISED,
        # RobotEmotion.FEAR
        if _input.strip().lower() == "angry":
            re.value = RobotEmotion.ANGRY
            # Blocking call
            self.robot_controller.set_current_emotion(re, True)

    def set_animation(self, _input, _time, _repeat, _scale):
        """
            Set a head animation
            :param _input: name of the animation to play (e.g. "nod")
            :param _time: animation duration in milliseconds
            :param _repeat: number of repetitions
            :param _scale: scale factor of the animation
        """
        ra = RobotAnimation()
        ra.time_ms = int(_time)
        ra.repetitions = int(_repeat)
        ra.scale = float(_scale)

        if _input.strip().lower() == "nod":
            ra.value = RobotAnimation.HEAD_NOD
            # Available Animations
            # ra.value = RobotAnimation.HEAD_SHAKE
            # ra.value = RobotAnimation.EYEBLINK_L
            # ra.value = RobotAnimation.EYEBLINK_R
            # ra.value = RobotAnimation.EYEBLINK_BOTH
            # ra.value = RobotAnimation.EYEBROWS_RAISE
            # ra.value = RobotAnimation.EYEBROWS_LOWER
            # Blocking call
            self.robot_controller.set_head_animation(ra, True)

    def gaze_somewhere(self):
        """
            Set a random absolute gaze target
            (pan, tilt and roll in degrees)
        """
        g = RobotGaze()
        g.gaze_timestamp = RobotTimestamp()
        g.vergence = 0.0
        g.pan_offset = 0.0
        g.tilt_offset = 0.0
        g.gaze_type = RobotGaze.GAZETARGET_ABSOLUTE
        # Relative movements are also possible
        # g.gaze_type = RobotGaze.GAZETARGET_RELATIVE
        g.pan = random.uniform(-20.0, 20.0)
        g.tilt = random.uniform(-20.0, 20.0)
        g.roll = random.uniform(-20.0, 20.0)
        self.robot_controller.set_gaze_target(g, True)

    def reset(self):
        """
            Reset the robot to a neutral (zero) gaze configuration
        """
        g = RobotGaze()
        g.gaze_timestamp = RobotTimestamp()
        g.vergence = 0.0
        g.pan_offset = 0.0
        g.tilt_offset = 0.0
        g.tilt = 0.0
        g.pan = 0.0
        g.roll = 0.0
        g.gaze_type = RobotGaze.GAZETARGET_ABSOLUTE
        self.robot_controller.set_gaze_target(g, True)

    def look_at_something(self, _x, _y, _z):
        """
            Set a gaze target at a Cartesian position
            :param _x, _y, _z: position to look at (in the eyes frame)
        """
        # Blocking call, empty frame_id (= eyes frame), roll = 0.0
        self.robot_controller.set_lookat_target(float(_x), float(_y), float(_z), True, "", 0.0)


if __name__ == '__main__':
    print "Flobi API Example\n"
    fae = FlobiAPIExample("ROS", "/flobi")
    fae.reset()
    print "\n>> Say something"
    fae.say_something("Hello World")
    time.sleep(3)
    fae.reset()
    print ">> Look Angry"
    fae.set_emotion("angry", 1000)
    time.sleep(2)
    fae.reset()
    print ">> Nodding..."
    fae.set_animation("nod", 1000, 1, 1.0)
    time.sleep(2)
    fae.reset()
    print ">> Look at..."
    # Make the robot look at something in front of it
    fae.look_at_something(-0.2, 0.0, 0.2)
    time.sleep(2)
    fae.reset()
    print ">> Absolute Gaze"
    fae.gaze_somewhere()
    time.sleep(2)
    print "\nDone."
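The gaze_somewhere example draws absolute pan, tilt and roll angles uniformly from ±20°. When computing gaze targets from other sources (e.g. face detections), it can help to clamp the angles into a safe range before assigning them to a RobotGaze object. The sketch below is plain Python; clamp_gaze_angles is a hypothetical helper (not part of hlrc_client), and the ±20° default is simply the range used in the example above:

```python
def clamp(value, low, high):
    """Clamp value into the interval [low, high]."""
    return max(low, min(high, value))


def clamp_gaze_angles(pan, tilt, roll, limit=20.0):
    """Clamp pan/tilt/roll (in degrees) into the +/-limit range.

    Returns a (pan, tilt, roll) tuple that can be assigned to a
    RobotGaze object before calling set_gaze_target().
    """
    return (clamp(pan, -limit, limit),
            clamp(tilt, -limit, limit),
            clamp(roll, -limit, limit))


print(clamp_gaze_angles(35.0, -50.0, 5.0))  # (20.0, -20.0, 5.0)
```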

Note

Your prefix might be different; the default is $HOME/citk/systems/flobi-minimal-xenial-0.2/

Then do the following (the VDEMO system, see above, must be running):

export prefix=/home/fl/citk/systems/flobi-minimal-xenial-0.2/
export PYTHONPATH=$prefix/lib/python2.7/dist-packages/:$prefix/lib/python2.7/site-packages/
source /opt/ros/kinetic/setup.bash
python flobi_api_example.py
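If the example fails with an ImportError, the PYTHONPATH export above is usually the culprit. A quick sanity check is to ask the interpreter whether hlrc_client is visible at all; module_available below is a hypothetical helper, not part of the Flobi tooling:

```python
import importlib


def module_available(name):
    """Return True if module `name` can be imported with the current PYTHONPATH."""
    try:
        importlib.import_module(name)
        return True
    except ImportError:
        return False


# On a correctly set-up machine this should print True:
print(module_available("hlrc_client"))
```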

Publications

If you plan to publish, please cite one of the following publications.