iCUBE

iCUBE is developing new methods to enable collaborative robots (co-bots) to learn in a more naturalistic manner, using sensors to interpret the actions, language, and expressions of their human collaborators. Advanced algorithms for decision-making, combined with reinforcement learning techniques, will enable more effective, productivity-enhancing human-robot cooperation for shared tasks.

Our first demonstrator project will show how a small industrial co-bot (a Universal Robots UR5) can be directed to learn how to sort laundry in preparation for washing, according to the human collaborators’ preferences, as given by natural language and gesture. Computer vision and machine learning techniques will be integrated within the demonstrator for gesture recognition, as well as recognition of the colour of the clothes and of the baskets in which to place the items of clothing.
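As an illustration of the colour-recognition step, here is a minimal sketch that classifies a cropped image of a single clothing item into a basket category using simple HSV heuristics in OpenCV. The threshold values, labels, and function names are assumptions for illustration only, not the demonstrator’s actual vision pipeline.

    # Minimal colour-classification sketch (illustrative assumptions only).
    import cv2
    import numpy as np

    # Hypothetical hue ranges (OpenCV hue runs 0-179) for basket categories.
    HUE_BINS = {
        "reds": [(0, 10), (170, 179)],
        "blues": [(100, 130)],
    }

    def classify_item(bgr_image):
        """Return a basket label for a cropped image of one clothing item."""
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        h, s, v = cv2.split(hsv)
        if np.median(v) < 60:          # very dark overall -> darks basket
            return "darks"
        if np.median(s) < 40:          # desaturated -> whites basket
            return "whites"
        hue = int(np.median(h))
        for label, ranges in HUE_BINS.items():
            if any(lo <= hue <= hi for lo, hi in ranges):
                return label
        return "coloureds"             # fallback basket

    item = cv2.imread("shirt.jpg")     # hypothetical test image
    print(classify_item(item))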

We are currently preparing for our first study, with the intention of capturing the language and gestures that humans use whilst directing a co-bot to sort laundry. To do this we will use a Wizard of Oz method, in which a human hidden from the participant fulfils the role of the co-bot ‘brain’. This will allow participants to express themselves naturally while the co-bot enacts their instructions correctly, or not. Errors in the co-bot’s responses are expected to elicit natural corrective reactions from the human. This natural language and these gestures will provide a corpus for the co-bot to use in its learning, as well as help improve the co-bot’s sense of its environment, the objects in it, and their relevance to it.

Industrial Cobots Understanding Behaviour (iCUBE)

Collaborative robots, or co-bots, are robots that work together with humans in a productivity-enhancing process, most often in the manufacturing and healthcare domains. Despite the aim of collaboration, co-bots currently lack the ability to sense humans and their behaviour appropriately. Instead, robots rely on explicit, mechanical, hierarchical instructions from the human, rather than drawing on more natural cues such as pose, expression, and language to determine their behaviour. In turn, humans do not understand how the robot makes its decisions.

Co-bots also do not use human behaviour to learn tasks implicitly, and advances in online and reinforcement learning are needed to enable this. In iCUBE we will address these issues by endowing industrial co-bots with the ability to sense and interpret the actions, language, and expressions of the person with whom they collaborate. We will show the new capabilities of such co-bots with a demonstrator in which a human teaches the co-bot to sort laundry by showing it what to do, telling it what to do, or simply indicating their satisfaction with the co-bot’s action through their expression.
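To make the learning idea concrete, the sketch below shows one simple way such feedback could drive learning: a bandit-style learner picks a basket for an item and updates its value estimates from a scalar satisfaction signal (for example, derived from the person’s expression or a spoken correction). The state and action encoding, reward values, and class names are assumptions for illustration; they are not the project’s actual learning component.

    # Bandit-style learning from human satisfaction (illustrative sketch).
    import random
    from collections import defaultdict

    ACTIONS = ["whites_basket", "darks_basket", "coloureds_basket"]

    class SortingLearner:
        def __init__(self, epsilon=0.2, alpha=0.5):
            self.q = defaultdict(float)  # value estimate per (state, action)
            self.epsilon = epsilon       # exploration rate
            self.alpha = alpha           # learning rate

        def choose(self, state):
            # Epsilon-greedy: occasionally explore, otherwise exploit.
            if random.random() < self.epsilon:
                return random.choice(ACTIONS)
            return max(ACTIONS, key=lambda a: self.q[(state, a)])

        def update(self, state, action, satisfaction):
            # One-step update from the human's reaction, e.g. +1 for an
            # approving expression, -1 for a correction.
            key = (state, action)
            self.q[key] += self.alpha * (satisfaction - self.q[key])

    # Usage: the learner picks a basket for a "red" item, the person
    # reacts, and that reaction becomes the reward.
    learner = SortingLearner()
    action = learner.choose("red")
    learner.update("red", action, satisfaction=-1.0)  # person corrected us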

iCUBE is currently integrating the BlueMax human sensing component, which will allow the co-bot to sense the user’s facial expressions and body gestures. We will soon add Natural Language Processing (NLP) tools so that people can speak with the co-bot. These tools will include Automatic Speech Recognition (ASR), dialogue management, and Text To Speech (TTS).
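The sketch below illustrates that ASR-to-dialogue-to-TTS loop using off-the-shelf stand-ins (the SpeechRecognition and pyttsx3 Python packages). These libraries, and the keyword-matching dialogue manager, are placeholders for illustration; they are not the specific tools being integrated in iCUBE.

    # Illustrative speech loop: listen, interpret, respond (stand-in tools).
    import speech_recognition as sr
    import pyttsx3

    def manage_dialogue(utterance):
        """Toy dialogue manager: map a command to a spoken confirmation."""
        text = utterance.lower()
        if "sort" in text:
            return "Okay, I will start sorting the laundry."
        if "stop" in text:
            return "Stopping now."
        return "Sorry, I did not understand that."

    recognizer = sr.Recognizer()
    tts = pyttsx3.init()

    with sr.Microphone() as source:   # requires a working microphone
        print("Listening...")
        audio = recognizer.listen(source)

    try:
        utterance = recognizer.recognize_google(audio)  # cloud ASR stand-in
    except (sr.UnknownValueError, sr.RequestError):
        utterance = ""

    reply = manage_dialogue(utterance)
    tts.say(reply)        # speak the response
    tts.runAndWait()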

iCUBE is a Smart Products Beacon-funded project, bringing together researchers from the School of Computer Science, the Institute of Advanced Manufacturing and Human Factors Research Group of the Faculty of Engineering, Horizon Research Institute, and BlueSkeye AI Ltd.

Key people involved:

  • Michel Valstar, School of Computer Science
  • David Branson, Institute of Advanced Manufacturing
  • Sue Cobb, Human Factors Research
  • Mercedes Torres Torres, School of Computer Science/Horizon
  • Adam Zhou, School of Computer Science
  • Pepita Stringer, Horizon
  • Dominic Price, Horizon
  • Timur Almaev, BlueSkeye AI

The main laundry sorting study will take place after the summer, starting in September, with results expected in March 2020. A series of demonstrator activities will be held in 2020, at which people will be able to interact with the co-bot.