For future updates, please follow the Smart Products Beacon website – June 2020

Our Products Campaign targeted business sectors that traditionally revolve around different kinds of physical products and explored how these could be transformed through emerging Internet of Things technologies coupled with human data. We took this forward jointly with the University of Nottingham Smart Products Beacon, which has continued the work started in Horizon.

Demonstrator projects continue to make progress, with the latest updates available on the Smart Products Beacon blog.

Smart Products Beacon – “Sensors support machine learning”

Nicholas Watson, Assistant Professor in the Faculty of Engineering, discusses whether online sensors and machine learning can deliver Industry 4.0 to the food and drink manufacturing sector in the Journal of the Institute of Food Science and Technology (vol. 33, issue 4, December 2019).

“Manufacturing is experiencing the 4th industrial revolution, which is the use of Industrial Digital Technologies (IDTs) to produce new and existing products. Industrial digital technologies include sensors, robotics, the industrial internet of things (IoT), additive manufacturing, artificial intelligence, virtual and augmented reality, digital twins and cloud computing. At the heart of Industry 4.0 is the enhanced collection and use of data. Industry 4.0 is predicted to have a positive impact of over £450bn to UK manufacturing over the next ten years[1], with benefits such as increased productivity and reduced costs and environmental impacts. But what does this mean for the UK’s largest manufacturing sector, food and drink?”

Link to article (page 20)

University of Nottingham Smart Products Beacon – job opportunity

Research Associate/Fellow (fixed term)

Reference: SC1494719

Closing date: Tuesday 4th February 2020

Job Type: Research

Department: Smart Products Beacon Computer Science

Salary: £27,511 – £40,322 per annum (pro rata if applicable) depending on skills and experience (minimum £30,943 with relevant PhD). Salary progression beyond this scale is subject to performance.

Applications are invited for a Computer Science and/or Engineering based Research Associate/Fellow within The Smart Products Beacon.

The Smart Products Beacon explores how leading edge technologies emerging from Computer Science and Engineering can fundamentally disrupt the nature of products and how they are made. This University-led initiative tackles how the combination of physical and digital technologies, from robotically-enabled and additive manufacturing to artificial intelligence and mixed reality, can produce smarter and better products. We also work to ensure that products are produced in responsible ways to embody the fair and transparent use of personal data, operate safely, and respect human values.

The purpose of this role will be to support the Smart Products Beacon in establishing its research agenda by contributing to the creation of an independent research program, linking across a number of disciplines to develop, deploy and study Beacon-related projects. Preference will be given to applicants in the following areas, but other skills will be considered if clear evidence of their link to the Beacon can be provided.

  • Software platform development
  • Artificial intelligence
  • Security
  • Development and integration of sensors and interfaces
  • Advanced manufacturing techniques (robotics, additive manufacturing, etc.)
  • User studies

The post holder will be expected to:

  • Create and lead an independent research program
  • Work as part of a multi-disciplinary team to enhance impact
  • Have the flexibility to work on several ongoing projects while developing their own work
  • Contribute to, and lead, high quality publications and proposals

The role holder will have the opportunity to use their initiative and creativity to identify areas for research, develop research methods and extend their research portfolio.

This is a full time, fixed term post for 3 years. Job share arrangements may be considered.

Informal enquiries may be addressed to Professor Steve Benford.  Applications must be submitted online; please note that applications sent by email will not be accepted.

Our University has always been a supportive, inclusive, caring and positive community. We warmly welcome those of different cultures, ethnicities and beliefs – indeed this very diversity is vital to our success, it is fundamental to our values and enriches life on campus. We welcome applications from the UK, Europe and from across the globe. For more information on the support we offer our international colleagues, visit: https://www.nottingham.ac.uk/jobs/applyingfromoverseas/index2.aspx

Professor Steve Benford explains the Smart Products Beacon

The Smart Products Beacon is tackling two big questions. What are smart products? And how are they made?

A smart product is one that uses digital technologies, and especially personal data, to become more adaptive, personalised and valuable. It captures data throughout its lifetime – through both manufacture and use – and uses this to adapt itself to consumers. In so doing it blends aspects of goods, services and experiences – the three dominant product logics from economics and business – into new forms. Sounds a bit abstract? Let’s take an example…

There was a time when a car was made of rubber and iron. A car was also something you bought and owned. But those days are passing. A modern car is part software, containing an engine management system that can adapt its driving behaviour, and also hosts a variety of other services for navigation and entertainment. Some might say the modern car is really a mobile phone on wheels. For many consumers, a car is now also a service that they lease rather than a good that they own.

But the transformation doesn’t end there. In a possible future world of autonomous cars, mobility itself may be the service, with consumers summoning vehicles on demand that adapt themselves on the fly to their preferences and history of previous travel. In this world, the physical units become interchangeable and it is the data that matters. You step into a car and it becomes yours by loading your personal profile and adapting itself to you. In that case, the car is the data. As Neo learns when he visits the Oracle: “There is no spoon” (only data).

If smart products are made from new materials – personal data – then they are also made in new ways. Digitally native products such as social media are inherently co-created. Consumers either explicitly provide content in the form of the videos and photos they upload directly or implicitly provide it through their records of searches, views and likes. Smart products, even future cars, will be similarly co-created as both manufacturers and consumers engage with digital platforms and data-driven product life-cycles.

This raises a further important question – how can consumers trust future products with their personal data? How can they be sure that products are safe and secure and that they can retain control of their own data?

This vision of co-creating trusted smart products lies at the heart of our beacon. We think that it applies to all manner of products, from high value goods to consumer goods to digital media experiences. We’re looking forward to exploring the possibilities further over the coming years.

My internship on the RoboClean project – Jane Slinger

My internship with the RoboClean team involved developing a custom Alexa skill to control Neato vacuum cleaners by voice. This will enable further development to link with the voice interface if required, as the other aspects of the project involve web systems and multi-agent systems. I also helped run a study to find out how users would interact with the potential system in a lab environment.

I enjoyed the work as it was in an area that interested me and had some challenges in the code to overcome, leading me to learn more about how the systems worked and to explore different solutions. It was nice to be able to build on the Alexa development skills learnt in my 3rd year project and to link to the Neato API through HTTP requests and a third-party library. This included setting up Account Linking on the Alexa skill and then adapting some of the code from libraries to work with Node.js on the backend instead of the front-end JavaScript-based methods that were already in place.
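The skill code itself isn’t reproduced here, but the account-linking pattern described above works roughly as follows: each Alexa request carries the user’s access token, and the skill’s backend forwards commands to the vacuum’s cloud API over HTTP. The sketch below is a minimal Python illustration of that flow (the project itself used Node.js); the API base URL, route and JSON fields are placeholders, not the real Neato API.

```python
# Minimal sketch (not the project's actual code): forwarding an Alexa
# "start cleaning" intent to a robot-vacuum cloud API over HTTP.
# The endpoint URL and JSON body below are illustrative placeholders.
import requests

VACUUM_API_BASE = "https://api.example-vacuum-cloud.com"  # placeholder URL


def handle_start_cleaning(event: dict) -> dict:
    """Handle an Alexa intent request by calling the vacuum's cloud API.

    `event` is the JSON payload Alexa sends to the skill's backend; once the
    user has linked their account, the OAuth access token arrives in
    event["context"]["System"]["user"]["accessToken"].
    """
    token = event["context"]["System"]["user"].get("accessToken")
    if not token:
        speech = "Please link your vacuum account in the Alexa app first."
    else:
        resp = requests.post(
            f"{VACUUM_API_BASE}/robots/default/clean",   # hypothetical route
            headers={"Authorization": f"Bearer {token}"},
            json={"mode": "house_cleaning"},
            timeout=10,
        )
        speech = ("Okay, starting to clean." if resp.ok
                  else "Sorry, I couldn't reach the robot.")

    # Alexa expects a response envelope containing the speech to read back.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```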

Designing the interactions with the robot and the user was also very interesting as I wanted to make sure that the system would prompt for the necessary information about the robot, and location to clean, without becoming annoying for the user.

The internship will help with my studies and future work as it has given me experience of working with a research team, building on areas I had some experience in as well as expanding to other technical skills that I hadn’t used before, and will be useful in the future.

Written by Jane Slinger

Charlotte Gray shares her experiences of working on RoboClean

I was introduced to the RoboClean project at Horizon whilst interning with the Advanced Data Analysis Centre. The project investigates the ways in which end-users interact with a robot vacuum cleaner and how a robot responds to user utterances; the aim being to inform its effective design and use within food factories.

I was invited to continue my internship for 5 more weeks within Horizon to help with the analysis of data collected through an elicitation study. Overall, this has been a really valuable and rewarding experience. Coming from an academic background in Sociology, I found that working closely with researchers specialising in Computer Science exposed me to different research aims and challenges than I had previously encountered. This has been insightful for me as it has not only helped me develop new skills in research analysis and interview techniques, but also let me apply the principles of a range of research methods gained during my academic studies over the past two years to cutting-edge technological developments.

I have been responsible for transcribing participants’ audio data, analysing visual data, and creating a summary written report of participants’ interview responses. The focus of the report was on the benefits, limitations, and disadvantages users experienced from the user-robot interactions. Attending a range of team meetings has also been beneficial in understanding interactions within a work environment, especially where individuals from a range of disciplines are working together. Combined with the skills I have learned in workload prioritisation and management, this has made me confident to face future work situations and dilemmas. Additionally, I have written literature reviews on the topic of human-robot interaction. Being able to explore these new topics has also helped me see how issues explored in Sociology are becoming increasingly influenced by the world of technology, for example, how individuals’ day-to-day lives are mediated by the introduction of robots to the workplace. Working alongside the multidisciplinary projects throughout Horizon has therefore also been interesting, clearly showing the benefit of collaborative projects in producing innovative findings.

Contributing to a research project which is aiming for publication in a research journal has been hugely rewarding and exciting, and has made the idea of working in a similar environment after graduating a lot more persuasive.

Written by Charlotte Gray

Smart Products Beacon – Soonchild and Creative Captioning – Tailoring theatre productions for D/deaf audiences

For theatre audiences on a spectrum from D/deaf to hard of hearing, it is often difficult to keep up with performances. Even in cases where the performance is signed, or has captions, these accessibility additions often feel ‘tacked on’ and are typically located out of the action on stage, requiring audiences to share attention between the performance and the support. Working with Red Earth Theatre, a production company with a long history of “Total communication” in which actors sign on stage, we have been developing ways to deliver accessibility right into the heart of a performance.

Red Earth’s new show Soonchild is touring the UK now, supported by funding from the University of Nottingham Smart Products Beacon, as well as the AHRC and the Arts Council. The show is captioned right across the set with beautiful, designed-in images, video and text delivered using new software developed at the Mixed Reality Laboratory.

The project team developed a piece of software called ‘captionomatic’, which uses the principles of projection mapping to turn whole theatre sets into projection surfaces. While projection mapping itself is by no means a new concept, our approach has been both to simplify the process and to fit it into the wider theatre-tech ecology. Our innovation is to take a 3D model of the set – easily produced from the scale model of the set pieces typically built for any performance – and project this onto the real set, using a simple system of point-matching to correctly align the physical set with its digital twin. Once that 3D model is in place, we can project images, video, text and anything else onto those set pieces, respecting occlusion and creating an immersive canvas on which to display content.
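As a rough illustration of the point-matching step – a sketch of the general technique, not the captionomatic implementation itself – the Python example below uses OpenCV to estimate a projector pose from a handful of correspondences between points on the 3D set model and the same points as seen by the projector, after which any point on the digital twin can be mapped to projector pixels. All coordinates, intrinsics and the synthesised pose are invented for the example.

```python
# Sketch of point-matching for projection mapping: estimate the projector's
# pose from 3D-2D correspondences, then map digital-twin points to pixels.
import numpy as np
import cv2

# Reference points (metres) on the digital twin of the set: four corners of
# the back wall plus two points on a raised rostrum.  Values are invented.
model_points = np.array([
    [0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [2.0, 1.5, 0.0], [0.0, 1.5, 0.0],
    [0.5, 0.2, 0.6], [1.5, 0.2, 0.6],
], dtype=np.float64)

# Idealised projector intrinsics (focal length and principal point in pixels).
K = np.array([[1500.0, 0.0, 960.0],
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume no lens distortion for the sketch

# In the theatre these pixel positions would be found by nudging a crosshair
# onto each physical reference point; here they are synthesised from a known
# pose so the example is self-contained and runnable.
true_rvec = np.array([0.05, -0.10, 0.02])
true_tvec = np.array([-1.0, -0.8, 3.0])
projector_points, _ = cv2.projectPoints(model_points, true_rvec, true_tvec, K, dist)

# Recover the projector pose from the six 3D-2D correspondences.
ok, rvec, tvec = cv2.solvePnP(model_points, projector_points, K, dist)

# Any caption anchor defined on the digital twin can now be mapped onto the
# physical set: here, a point one metre up the back wall.
caption_anchor = np.array([[1.0, 1.0, 0.0]])
pixels, _ = cv2.projectPoints(caption_anchor, rvec, tvec, K, dist)
print("draw caption at projector pixel:", pixels.ravel())
```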

We provide tools to read in the script from a Word document, produce a complete set of captions, and then generate the necessary cues, which can be fired by QLab (or similar theatrical show-control software) to drive our system. Theatre designers need only edit the target locations and the look and feel of the text to create beautiful captions around their sets. Different sets of captions can be delivered for different audiences as necessary – so some shows may be fully captioned while others may only have key points highlighted. We know from our research that different audiences have different preferences for how captions are delivered, and our system allows theatre companies to quickly and confidently make adjustments – even between performances of the same show. Setup of the system in a new location takes only a few minutes, something that is absolutely necessary for touring productions.
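A minimal sketch of the script-to-captions step might look like the following, assuming one caption cue per non-empty paragraph of the script; the cue fields, target name and output format are placeholders rather than the tool’s actual format.

```python
# Sketch of an assumed script-to-captions workflow (not the actual tool):
# read each line of the script from a Word document and emit a numbered list
# of caption cues that a cue-firing system could step through during the show.
import json
from docx import Document  # pip install python-docx


def script_to_cues(docx_path: str) -> list[dict]:
    """Turn each non-empty paragraph of the script into one caption cue."""
    doc = Document(docx_path)
    cues = []
    for paragraph in doc.paragraphs:
        text = paragraph.text.strip()
        if not text:
            continue
        cues.append({
            "cue": len(cues) + 1,   # cue number, fired in sequence
            "text": text,           # caption text to render on the set
            "target": "back_wall",  # placeholder: which set piece to project onto
        })
    return cues


if __name__ == "__main__":
    cues = script_to_cues("soonchild_script.docx")  # hypothetical file name
    with open("caption_cues.json", "w") as f:
        json.dump(cues, f, indent=2)
```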

More broadly, this new approach to projection mapping allows substantial creativity with digital media in theatre that extends beyond accessibility. Critically, it greatly reduces the technical barrier to entry for including projection-mapped media in a show. Soonchild demonstrates this with some beautiful interactions between live actors and pre-recorded media – in this case shadow puppetry, projected on set as if live.

The software was demonstrated at an accessible theatre technology day at the Arena Theatre in Wolverhampton, and plans for additional workshops and training are in the works. Although developed for Soonchild, the software has been designed to be easily applicable to many different types of show, and is therefore open source and free, requiring only off-the-shelf hardware (a PC and projector). We will also be making the hardware used in the show – a projector powerful enough to compete with theatrical lighting – available for other production companies to borrow and experiment with once Soonchild’s tour is complete.

This work was developed in partnership between Red Earth Theatre, the Mixed Reality Laboratory, the School of English and the Department of Modern Languages and Cultures.

The project website is available here.

Soonchild will be performed at the Lakeside Arts Theatre in Nottingham on Sunday 24th November – more information can be found here.

AI Technologies for Allergen Detection and Smart Cleaning Food Production Workshop

In collaboration with the AI3 Science Discovery (AI3SD) and Internet of Food Things (IoFT) EPSRC Networks, the RoboClean team ran a workshop in London on the 17th of October. The focus of the workshop was to discuss how digital technologies such as AI, sensors and robotics can be used for enhanced allergen detection and factory cleaning within food production environments. The workshop was well attended by a range of stakeholders from industry, academia and organisations such as the Food Standards Agency.

The morning of the workshop had three speakers. Nik Watson from the University of Nottingham gave a talk on the future of factory cleaning, covering a range of research projects from the University that have developed new digital technologies to monitor and improve factory cleaning processes. The second talk was from AI3SD lead Jeremy Frey from the University of Southampton, who gave an introduction to AI and covered a range of new sensors which could be used to detect the presence of allergens in a variety of food products and environments. The final talk was delivered by Martin Peacock from Zimmer and Peacock, a company that develops and manufactures electrochemical sensors. Martin introduced the company and the technologies it develops before demonstrating how its sensor could be connected to an iPhone to determine the hotness of chilli sauce. Martin’s talk finished by discussing how electrochemical sensors could be used to detect allergens within a factory environment. The afternoon of the workshop focused on group discussions on the following four topics, all related to allergen detection and cleaning within food production:

  • Data collection, analysis and use
  • Ethical issues
  • Cleaning robots
  • Sensors

Each group had a lead; however, delegates moved between tables so they could contribute to more than one discussion. At the end of the workshop the lead from each group reported back with the main discussion points covered by the delegates. The delegates on the ‘robotics’ table reported that robots would play a large role in the future of factory cleaning, as they would free up factory operators to spend time on more complicated tasks. The group felt that the design of the robots was essential, and suggested that new factories should also be designed differently to facilitate robot cleaning more easily. The group also thought that effective communication with the robot was a key issue which needed further research. The ‘sensors’ group reported that any new sensors used to detect allergens or levels of cleanliness would need to fit into existing regulations and practices, but would be welcomed by the industry, especially if they could detect allergens or bacteria in real time. The ‘data’ group reported that there was a need for data standards relevant to industrial challenges, as well as a need for open-access data to enable the development of suitable analysis and visualisation methods. The ‘ethics’ group discussed numerous key topics including bias, uncertainty, transparency, augmented intelligence and the objectivity of AI.

HALFWAY TO THE FUTURE

The Smart Products Beacon is delighted to be supporting Halfway to the Future – a symposium at the Albert Hall Conference Centre in Nottingham on the 19th & 20th November, exploring the past, present and future of HCI and design-based research and marking the 20th anniversary of the Mixed Reality Lab at the University of Nottingham.

The symposium will address a range of key themes with dedicated single-track panels, each anchored by prominent keynote speakers reflecting upon one of their influential works in light of subsequent developments and present concerns. This will be followed by presentations of current related research, short future-oriented provocations, and a panel discussion/Q&A. The symposium will also incorporate an exhibition of interactive works and a poster session.

Symposium programme

I-CUBE

I-CUBE is developing new methods to enable collaborative robots (co-bots) to learn in a more naturalistic manner, using sensors to interpret the actions, language and expressions of their human collaborators. Advanced algorithms for decision-making, combined with reinforcement learning techniques, will enable more effective, productivity-enhancing human-robot cooperation on shared tasks.

Our first demonstrator project will show how a small industrial co-bot (a Universal Robots UR5) can be directed to learn how to sort laundry in preparation for washing, according to the human collaborators’ preferences as given by natural language and gesture. Computer vision and machine learning techniques will be integrated within the demonstrator for gesture recognition, as well as recognition of the colour of the clothes and of the baskets in which to place the items of clothing.
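As a simple illustration of the colour-recognition step – one plausible approach, not the I-CUBE implementation – the sketch below classifies a cropped garment image into a washing basket using its average HSV saturation and brightness; the thresholds, file name and category names are invented for the example.

```python
# Illustrative sketch: classify a cropped image of a garment as lights, darks
# or colours from its dominant HSV values, so the co-bot knows which basket
# to place it in.  Thresholds are illustrative only.
import cv2
import numpy as np


def classify_garment(bgr_crop: np.ndarray) -> str:
    """Return 'lights', 'darks' or 'colours' for a cropped garment image."""
    hsv = cv2.cvtColor(bgr_crop, cv2.COLOR_BGR2HSV)
    sat = hsv[:, :, 1].mean()   # low saturation -> white/grey/black fabric
    val = hsv[:, :, 2].mean()   # brightness separates whites from blacks

    if sat < 40:
        return "lights" if val > 150 else "darks"
    return "colours"


if __name__ == "__main__":
    frame = cv2.imread("garment.jpg")  # hypothetical crop from the co-bot's camera
    print("place in basket:", classify_garment(frame))
```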

We are currently preparing for our first study, with the intention of capturing the language and gestures that humans use whilst directing a co-bot to sort laundry. To do this we will use a Wizard of Oz method, where a human will fulfil the role of the co-bot ‘brain’ whilst being hidden from the participant. This will allow participants to express themselves naturally while the co-bot enacts their instructions correctly, or not. Errors in the co-bot’s responses are expected to elicit natural corrective reactions from the human. This natural language and these gestures will provide a corpus for the co-bot to use in its learning, as well as assisting in improving the co-bot’s sense of its environment, the objects in it and their relevance to it.