Collaborative robots, or co-bots, are robots that work collaboratively with humans in a productivity-enhancing process, most often in manufacturing and healthcare. Despite the aim to collaborate, co-bots lack the ability to sense humans and their behaviour appropriately. Instead, robots rely on mechanical, hierarchical instructions given explicitly by the human, rather than drawing on more natural cues such as pose, facial expression, and language to determine their behaviour. In turn, humans do not understand how the robot makes its decisions.
Co-bots also do not use human behaviour to learn tasks implicitly, and advances in online and reinforcement learning are needed to enable this. In iCUBE we will address these issues by endowing industrial co-bots with the ability to sense and interpret the actions, language, and expressions of the person with whom they collaborate. We will demonstrate these new capabilities with a laundry-sorting task: a human will teach the co-bot to sort laundry by showing it what to do, telling it what to do, or simply indicating satisfaction with the co-bot's action through their expression.
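The teach-by-feedback idea above can be sketched as a simple online learner that treats the person's expression as a reward signal. This is a minimal illustration only; the class, bin names, and reward mapping are all hypothetical, not part of the iCUBE system:

```python
import random

# Illustrative sketch: an epsilon-greedy bandit that learns which pile a
# garment type belongs in, using the human's expression (mapped to +1/-1)
# as the reward. All names here are assumptions for illustration.

class SortingLearner:
    def __init__(self, bins, epsilon=0.1):
        self.bins = bins
        self.epsilon = epsilon   # probability of trying a random bin
        self.values = {}         # running value estimate per (garment, bin)
        self.counts = {}         # number of trials per (garment, bin)

    def choose_bin(self, garment):
        # Explore occasionally; otherwise pick the best-valued bin so far.
        if random.random() < self.epsilon:
            return random.choice(self.bins)
        return max(self.bins, key=lambda b: self.values.get((garment, b), 0.0))

    def give_feedback(self, garment, chosen_bin, reward):
        # Incremental mean update: V <- V + (r - V) / n
        key = (garment, chosen_bin)
        self.counts[key] = self.counts.get(key, 0) + 1
        v = self.values.get(key, 0.0)
        self.values[key] = v + (reward - v) / self.counts[key]

# Simulated teaching: the person "smiles" (+1) only when a red shirt goes
# to the colours pile. epsilon=0.0 keeps the demo deterministic.
learner = SortingLearner(["whites", "colours", "delicates"], epsilon=0.0)
for _ in range(50):
    b = learner.choose_bin("red shirt")
    learner.give_feedback("red shirt", b, 1.0 if b == "colours" else -1.0)
```

After a few corrections the learner settles on the rewarded bin, which is the implicit-teaching behaviour the project aims for at a much richer scale.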
iCUBE is currently integrating the BlueMax human sensing component, which will allow the co-bot to sense the user's facial expressions and body gestures. We will soon add Natural Language Processing (NLP) tools so that people can speak with the co-bots. These tools will include Automatic Speech Recognition (ASR), dialogue management, and Text To Speech (TTS).
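The spoken-interaction loop described above (ASR feeding a dialogue manager, whose reply is voiced by TTS) can be sketched as follows. The function names and the toy rule-based dialogue manager are assumptions for illustration, not the actual iCUBE or BlueMax interfaces:

```python
# Hypothetical sketch of the planned spoken-interaction pipeline:
# audio -> ASR -> dialogue management -> TTS.

def recognise_speech(audio):
    # Stand-in for an ASR component: audio in, transcript out.
    return audio.get("transcript", "")

def manage_dialogue(utterance, state):
    # Minimal rule-based dialogue manager for the laundry task.
    if "sort" in utterance.lower():
        state["task"] = "sort_laundry"
        return "Okay, show me which pile this garment belongs in."
    return "Sorry, I did not understand. Could you rephrase?"

def synthesise_speech(text):
    # Stand-in for TTS: here we simply return the text to be spoken.
    return text

def interaction_step(audio, state):
    text = recognise_speech(audio)
    reply = manage_dialogue(text, state)
    return synthesise_speech(reply)

state = {}
spoken = interaction_step({"transcript": "Please sort the laundry"}, state)
```

In the real system each stand-in function would be replaced by a full component, but the control flow between them stays the same.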
iCUBE is a Smart Products Beacon-funded project, bringing together researchers from the School of Computer Science, the Institute of Advanced Manufacturing and Human Factors Research Group of the Faculty of Engineering, Horizon Research Institute, and BlueSkeye AI Ltd.
Key people involved:
- Michel Valstar, School of Computer Science
- David Branson, Institute of Advanced Manufacturing
- Sue Cobb, Human Factors Research
- Mercedes Torres Torres, School of Computer Science/Horizon
- Adam Zhou, School of Computer Science
- Pepita Stringer, Horizon
- Dominic Price, Horizon
- Timur Almaev, BlueSkeye AI
The main laundry sorting study will happen after summer, starting in September, with results expected in March 2020. A series of demonstrator activities will be held in 2020 where people will be able to interact with the co-bot.
Human-robot collaboration is expected to increase in the coming years [Elprama, 2017]. The RoboClean project will investigate the potential of human-robot collaboration, integrated with IoT sensors, for cleaning and allergen detection on a factory floor. We will explore the impact of introducing a robot that can work "hand to hand", or rather "hand to robo-hand", with a factory worker, focusing on assisting workers rather than replacing their role.
RoboClean targets the food industry, supporting cleaning and safety processes in a sector where the presence of an allergen is a safety risk, and one whose annual contribution to the economy is £28 billion.
Early stages of the project involved a visit to a bread factory in Birmingham to learn about current work practices in food-factory cleaning and to gain a better understanding of the social context of food factories. In addition, we built and evaluated a "Human-Robot Collaborative-Hub", effectively the "brain", combining a voice interface and a robot cleaner. This hub will store data on robot activity and identify zones to be cleaned through voice-interface control. This activity enabled us to identify what functions and procedures (the API) were required to control the robot.
The next stage of the project will involve designing and developing the architecture of the "Human-Robot Collaborative-Hub" as a bridge between cleaner robots and different user interfaces, exploring control of the robot in different areas through voice commands such as "Robot, clean around the round table" or "Robot, clean around the fridge". We are also working on integrating a sensor capable of detecting allergens, with the aim of directing the robot cleaner to specific locations using data from the "HR-Collaborative-Hub".
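The hub's role as a bridge between voice commands and the robot can be sketched as below. The zone names, coordinates, command grammar, and hub interface are all assumptions made for illustration; the project's actual API is still being designed:

```python
# Illustrative sketch: the "Human-Robot Collaborative-Hub" maps a spoken
# command to a named cleaning zone and dispatches the robot, logging the
# activity. Everything here is hypothetical.

ZONES = {
    "round table": (2.0, 3.5),   # x, y on a factory-floor map
    "fridge": (5.0, 1.0),
}

def parse_command(utterance):
    # Expect commands like "Robot, clean around the round table".
    text = utterance.lower()
    for zone in ZONES:
        if zone in text:
            return zone
    return None

class Hub:
    def __init__(self):
        self.activity_log = []   # the hub stores robot activity data

    def dispatch(self, utterance):
        zone = parse_command(utterance)
        if zone is None:
            return "Zone not recognised."
        x, y = ZONES[zone]
        self.activity_log.append({"zone": zone, "target": (x, y)})
        return f"Cleaning around the {zone} at ({x}, {y})."

hub = Hub()
reply = hub.dispatch("Robot, clean around the fridge")
```

The logged activity data is what would later let the allergen sensor direct the cleaner to specific locations through the same hub.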
An overview slide of RoboClean
We have adopted three demonstrator projects to kick off the Campaign.
Projects are part-funded by the University of Nottingham Smart Products Beacon of Excellence.
RoboClean: Human Robot Collaboration for Allergen-Aware Factory Cleaning
In food and drink manufacturing, a third of working time is spent cleaning, which significantly affects productivity and efficiency. This project aims to understand and address the industry need for cleaning support technologies by developing and deploying human robot collaboration to assist in the cleaning of factories and detect the unwanted presence of allergens to prevent food safety events.
Food Design for Future Dining (FD)²
Traditionally, "food design" has been an area of expertise for chefs, where raw materials are combined and cooked or processed, blending multiple components to create a "dish". However, the food engineering involved in current culinary processes resembles work taking place at universities, with extensive testing of exotic ingredients requiring control and high levels of precision. Food is also a highly regulated commodity: in order to bring a food to market, regulatory requirements must be met and businesses must be able to support any claims they make with evidence. This project will address a number of questions through creative technologies and engineering research, designing and demonstrating prototypical digital foods that provide novel physical and digital eating experiences.
Industrial Co-bots Understanding Behaviour (I-CUBE)
Collaborative robots, or co-bots, are robots that work collaboratively with humans in a productivity-enhancing process, most often in manufacturing and healthcare. Despite the aim to collaborate, co-bots lack the ability to sense humans and their behaviour appropriately. Instead, robots rely on mechanical, hierarchical instructions given explicitly by the human, rather than using more natural cues such as pose, expression, and language to determine behaviour. In turn, humans do not understand how the robot makes its decisions. This project will enable research in human-robot cooperation and bring together the know-how and research interests of human-factors work, automatic human behaviour analysis, and machine learning. Based on a laundry-sorting co-bot task, the project will investigate how humans and robots explain their tasks, cope with mistakes, and guide each other to overcome (impending) failure.