RoboClean update 11/6/2019

The RoboClean project is investigating the work of cleaning factory floors, and the potential for robotic cleaners to work alongside, and with, human operators to ensure factories meet strict industry hygiene guidance. These robots will also use the latest sensors to detect the presence of food allergens, allowing factory managers to avoid cross-contamination of products, especially in batch-driven processes.

The project will deliver and evaluate an interactive connected platform to enable novel human-robot collaboration and IoT smart sensor data collection in food factories. See our prior blog post for more information about the project. In this post we present an update on our progress.

We are engaging with local SMEs and multinational food manufacturers to understand more about the sorts of environments in which we envisage these technologies being deployed. Through interviews, workshops, and factory visits we intend to explicate the requirements and challenges, both legal and socio-technical, of deploying robots in complex environments such as factories. These visits are now ongoing and their outcomes will inform the project’s design work. This work is being led by Martin Porcheron in Computer Science.

Roberto Santos, from the University of Nottingham Digital Research Service (DRS), has joined the project and is collaborating with Carolina Fuentes from the Horizon Digital Economy Research Institute on the development of our demonstrator robot platform. When complete, this platform will support the autonomous and manual management of robot teams as well as individual robots. Our focus at this stage is to deliver a platform suitable for controlling one robot at a time, and this is already taking shape, supported by a number of elicitation studies we are developing to understand the language and sorts of commands factory workers would use to direct and coordinate robots. Brian Logan from the Agents Lab in Computer Science is working with the team to ensure the platform design is suited to the multi-agent collaboration goals that will be delivered in later stages of the project.

Ahmed Rady from the Faculty of Engineering has also recently joined the project and is developing the processes for the smart sensors to detect various allergens, including collecting the data that will be vital for detecting them. One of the biggest challenges facing manufacturers is the cross-contamination of allergens within the manufacturing environment, and cleaning is a critical step in preventing this. By deploying sensors with the robots, we will be able to detect, and potentially prevent, food safety events before product leaves the factory.

Overall, the team is already working towards its first deliverables and is looking forward to a successful 2019.

Finally, the team will be presenting a poster at the ConnectedEverything 2019 conference in June, where we will be on hand to discuss the project’s objectives, approach, outcomes, and potential collaborations. We think this is a great opportunity to connect with potential partners in the manufacturing industry and look forward to seeing you there.

Written by Martin Porcheron


Halfway to the Future – A symposium in Nottingham, UK from 19th-20th November 2019

The Halfway to the Future symposium is a two-day event in the city of Nottingham, UK exploring the past, present, and future of HCI and design-based research. The symposium will take place on the 19th and 20th November 2019 at the Albert Hall Conference Centre.

The symposium will address a range of key themes with dedicated single-track panels, each anchored by a prominent keynote speaker reflecting upon one of their influential works in light of subsequent developments and present concerns. This will be followed by presentations of current related research, short future-oriented provocations, and a panel discussion/Q&A. The symposium will also incorporate an exhibition of interactive works and a poster session. All papers will be peer reviewed under a double-blind process; some will be selected for panels while others will be invited to present their work in poster format. The call for papers is now open.

Take a look at the symposium Agenda.

If you would like to keep up to date with the symposium, register for updates here.

If you have any questions, please don’t hesitate to contact the organising committee. We are currently putting together an exciting programme of talks and demos, with all keynote speakers confirmed. We look forward to your submissions!

We would like to thank the University of Nottingham Faculty of Science and ACM SIGCHI for generously sponsoring the symposium.

Twitter: @httfsymposium 

University of Nottingham Smart Products Beacon – job opportunity

The Smart Products Beacon of Excellence is tackling a new challenge facing the world – how can we harness the digital revolution in a responsible way, for the good of all?  More information about the Beacon can be found here.

The Smart Products Beacon is currently seeking applications for a Research Development Manager. The successful applicant will be a key member of the Beacon team and contribute to achieving a significant increase in the Beacon’s research funding portfolio, in line with the University’s Research Strategy. The role will manage a portfolio of research development applications and relationships for the Smart Products Beacon, with a particular focus on applications to UKRI as well as appropriate EU funding sources. The closing date for applications is Wednesday 8th May 2019. More information about the post and how to apply can be found here.

Follow the Smart Products Beacon on Twitter @ProductsBeacon 


Smart Products Beacon Gathering

It’s been a busy year for the Smart Products Beacon during which we’ve refined our research agenda and vision, launched demonstrator projects, developed our business case for the University and secured funding for initial projects, including EPSRC funding to establish a Centre for Doctoral Training for at least 65 new PhD students.

Today (8th April 2019), we are holding a one-day gathering to learn more and explore how people can get involved.

The event commenced with an overview of RoboClean, Food Design for Future Dining (FD)2, and Industrial Co-bots Understanding Behaviour (iCUBE).

A breakout session led to some useful research-led discussion on emerging themes, including:

  • Process planning for highly-customised products
  • Social technological interactions and implications
  • Data-enabled smart optimization
  • Digital Technology, Manufacturing, and Productivity
  • Smart Musical Instruments

A session of contributed paper presentations followed a short lunch break and poster session. These included:

  • User-Experience Design for Future Smart Vehicles
  • Managing attractiveness and tensions in digitally enhanced business environments
  • Locating the Smart Product Beacon: understanding the place based-agenda in RCUK funding (or, why economic geography matters)
  • Demonstrating a framework to investigate combined packing and scheduling problems
  • Peeling away the layers: toward a metaphor of foam to analyse composed digital-physical products
  • Physical-Digital Alignment

The day will conclude with a second breakout session – an opportunity to address and plan key beacon activities for the coming year.

Industrial Co-bots Understanding Behaviour (iCUBE)

Collaborative robots, or co-bots, are robots that work collaboratively with humans in a productivity-enhancing process, most often associated with the manufacturing and healthcare domains. Despite the aim to collaborate, co-bots lack the ability to sense humans and their behaviour appropriately. Instead, robots rely on explicit, mechanical, hierarchical instructions given by the human, rather than drawing on more natural cues such as pose, expression, and language to determine their behaviour. In turn, humans do not understand how the robot makes its decisions.

Co-bots also do not utilise human behaviour to learn tasks implicitly, and advances in online and reinforcement learning are needed to enable this. In iCUBE we will address these issues by endowing industrial co-bots with the ability to sense and interpret the actions, language, and expressions of the person with whom they collaborate. We will show the new capabilities of such co-bots with a demonstrator in which people instruct a robot to sort laundry. A human will teach the co-bot to sort laundry by showing it what to do, telling it what to do, or simply by indicating their satisfaction with the co-bot’s action through their expression.

iCUBE is currently integrating the BlueMax human sensing component that will allow the co-bot to sense the user’s facial expressions and body gestures. We will soon add to this Natural Language Processing (NLP) tools so that people can speak with the co-bots. Such tools will include Automatic Speech Recognition (ASR), dialogue management, and Text To Speech (TTS).
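As a concrete, if simplified, picture of the interaction loop this enables, the sketch below combines a spoken instruction with sensed expression feedback to drive a laundry-sorting decision. Everything here is a hypothetical stand-in for illustration: Observation, Feedback, sort_item, and react_to_feedback are our own names, and we assume the ASR and expression-sensing components (such as BlueMax) have already produced text and a coarse feedback label.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical sketch of an iCUBE-style interaction loop: a spoken
# instruction chooses a laundry bin, and the person's expression after
# the action tells the co-bot whether it did the right thing. The real
# BlueMax/NLP component APIs are not modelled here.

class Feedback(Enum):
    POSITIVE = auto()   # e.g. a smile: the last action was right
    NEGATIVE = auto()   # e.g. a frown: the last action was wrong
    NEUTRAL = auto()

@dataclass
class Observation:
    utterance: str      # text produced by the ASR component
    feedback: Feedback  # label produced by the expression sensor

def sort_item(obs: Observation, bins: dict) -> str:
    """Choose a bin from the spoken instruction; fall back to 'unsorted'."""
    for keyword, bin_name in bins.items():
        if keyword in obs.utterance.lower():
            return bin_name
    return "unsorted"

def react_to_feedback(feedback: Feedback) -> str:
    """Decide what the co-bot should do after its last action."""
    if feedback is Feedback.NEGATIVE:
        return "undo placement and ask for a demonstration"
    if feedback is Feedback.POSITIVE:
        return "keep the placement and reinforce the behaviour"
    return "continue"

if __name__ == "__main__":
    bins = {"white": "whites", "dark": "darks", "delicate": "delicates"}
    obs = Observation("Put the white shirt with the whites", Feedback.POSITIVE)
    print(sort_item(obs, bins))              # -> whites
    print(react_to_feedback(obs.feedback))   # -> keep the placement ...
```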

iCUBE is a Smart Products Beacon-funded project, bringing together researchers from the School of Computer Science, the Institute of Advanced Manufacturing and Human Factors Research Group of the Faculty of Engineering, Horizon Research Institute, and BlueSkeye AI Ltd.

Key people involved:

  • Michel Valstar, School of Computer Science
  • David Branson, Institute of Advanced Manufacturing
  • Sue Cobb, Human Factors Research Group
  • Mercedes Torres Torres, School of Computer Science/Horizon
  • Adam Zhou, School of Computer Science
  • Pepita Stringer, Horizon
  • Dominic Price, Horizon
  • Timur Almaev, BlueSkeye AI

The main laundry sorting study will start in September, after the summer, with results expected in March 2020. A series of demonstrator activities will be held in 2020 where people will be able to interact with the co-bot.

RoboClean: Human Robot Collaboration for Allergen-Aware Factory Cleaning

Human-robot collaboration is expected to increase in the coming years [Elprama, 2017]. The RoboClean project will investigate the potential of human-robot collaboration, integrated with IoT sensors, for cleaning and allergen detection on a factory floor. We will be exploring the impact of introducing a robot that can work “hand in hand”, or rather “hand in robo-hand”, with a factory worker – focusing on assisting them rather than replacing their role.

RoboClean targets the food industry – supporting the cleaning and safety process in a sector where the presence of an allergen is a safety risk, and one whose relevance is underlined by its annual contribution of £28 billion to the economy.

Early stages of the project involved a visit to a bread factory in Birmingham to learn about current work practices in food factory cleaning and to gain a better understanding of the social context of food factories. In addition, we built and evaluated a “Human-Robot Collaborative-Hub” – effectively the “brain” combining a voice interface and a robot cleaner. This “hub” will store data on robot activity, which can be used to identify zones to be cleaned via the voice interface. This activity enabled us to identify what functions and procedures (an API) were required to control the robot.
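As an illustration of what such a hub might look like in code, the minimal sketch below registers a robot, logs its activity, and dispatches a cleaning job for a named zone. The class and method names (CollaborativeHub, RobotController, clean_zone) are hypothetical stand-ins for this post, not the API the project actually identified.

```python
import datetime

# A minimal sketch of a "Human-Robot Collaborative-Hub". The hub keeps a
# log of robot activity and exposes the small set of calls a voice
# interface would need. RobotController stands in for a real robot API.

class RobotController:
    """Stub controller for a single cleaning robot."""
    def go_to(self, zone):
        print(f"robot: navigating to {zone}")

    def start_cleaning(self):
        print("robot: cleaning started")

class CollaborativeHub:
    def __init__(self):
        self.activity_log = []   # (timestamp, robot_id, event) tuples
        self.robots = {}         # robot_id -> RobotController

    def register_robot(self, robot_id, controller):
        self.robots[robot_id] = controller

    def log(self, robot_id, event):
        self.activity_log.append((datetime.datetime.now(), robot_id, event))

    def clean_zone(self, robot_id, zone):
        """Dispatch a cleaning job for a named zone, e.g. 'round table'."""
        controller = self.robots[robot_id]
        controller.go_to(zone)
        controller.start_cleaning()
        self.log(robot_id, f"cleaning {zone}")

if __name__ == "__main__":
    hub = CollaborativeHub()
    hub.register_robot("cleaner-1", RobotController())
    hub.clean_zone("cleaner-1", "round table")
```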

The next stage of the project will involve designing and developing the architecture of the “Human-Robot Collaborative-Hub” as a bridge between cleaner robots and different user interfaces – exploring how robot cleaning in different areas can be controlled through voice interfaces, for example: “Robot, clean around the round table” or “Robot, clean around the fridge”. We are also working on integrating a sensor capable of detecting allergens, with the aim of directing the robot cleaner to specific locations using data from the “HR-Collaborative-Hub”.
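One simple, hypothetical way to bridge such utterances to the hub is to extract the zone name from the recognised text and look it up against a map of known zones. The sketch below assumes the speech has already been transcribed; the zone names and coordinates are invented for illustration.

```python
import re

# Hypothetical command parser for utterances like the examples above.
# We assume an ASR engine has already produced the text; the zone map
# below (names and positions) is invented for illustration.

ZONES = {"round table": (3.2, 1.5), "fridge": (0.5, 4.0)}  # name -> map position

COMMAND_RE = re.compile(
    r"robot,?\s+clean\s+(?:around\s+)?the\s+(?P<zone>[\w\s]+)",
    re.IGNORECASE,
)

def parse_clean_command(utterance: str):
    """Return (zone, position) for a recognised command, else None."""
    match = COMMAND_RE.match(utterance.strip())
    if not match:
        return None
    zone = match.group("zone").strip().lower()
    if zone not in ZONES:
        return None   # unknown zone: ask the user to repeat or rephrase
    return zone, ZONES[zone]

if __name__ == "__main__":
    print(parse_clean_command("Robot, clean around the round table"))
    # -> ('round table', (3.2, 1.5))
    print(parse_clean_command("Robot, clean around the fridge"))
    # -> ('fridge', (0.5, 4.0))
```

In the full platform, the zone map might instead be derived from the robot activity data the hub stores, rather than hard-coded as here.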

Introducing our Demonstrator projects

We have adopted three demonstrator projects to kick off the campaign.

The projects are part-funded by the University of Nottingham Smart Products Beacon of Excellence.

RoboClean: Human Robot Collaboration for Allergen-Aware Factory Cleaning 

In food and drink manufacturing, a third of working time is spent cleaning, which significantly affects productivity and efficiency. This project aims to understand and address the industry need for cleaning support technologies by developing and deploying human-robot collaboration to assist in the cleaning of factories and detect the unwanted presence of allergens to prevent food safety events.

Food Design for Future Dining (FD)2

Traditionally “food design” has been an area of expertise for chefs, where raw materials are combined and cooked or processed, blending multiple components to create a “dish”. However, the food engineering involved in current culinary processes resembles processes taking place at universities, with extensive testing of exotic ingredients requiring control and high levels of precision. Food is also a highly regulated commodity: in order to bring a food to market, regulatory requirements must be met and businesses must be able to support any claims made by reference to evidence. This project will address a number of these questions through creative technologies and engineering research – designing and demonstrating prototypical digital foods that provide novel physical and digital eating experiences.

Industrial Co-bots Understanding Behaviour (iCUBE)

Collaborative robots, or co-bots, are robots that work collaboratively with humans in a productivity-enhancing process, most often associated with the manufacturing and healthcare domains. Despite the aim to collaborate, co-bots lack the ability to sense humans and their behaviour appropriately. Instead, robots rely on explicit, mechanical, hierarchical instructions given by the human, rather than drawing on more natural cues such as pose, expression, and language to determine their behaviour. In turn, humans do not understand how the robot makes its decisions. This project will enable research in human-robot cooperation and bring together the ‘know-how’ and ‘research interests’ of human factors work, automatic human behaviour analysis, and machine learning. Based on a laundry-sorting co-bot task, the project will investigate how humans and robots explain their tasks, cope with mistakes, and guide each other to overcome (impending) failure.