I-CUBE is developing new methods to enable collaborative robots (co-bots) to learn in a more naturalistic manner, using sensors to interpret the actions, language and expressions of their human collaborators. Advanced algorithms for decision-making, combined with reinforcement learning techniques, will enable more effective, productivity-enhancing human-robot cooperation on shared tasks.

Our first demonstrator project will show how a small industrial co-bot (a Universal Robots UR5) can be directed to learn how to sort laundry in preparation for washing, according to the human collaborators' preferences as expressed through natural language and gesture. Computer vision and machine learning techniques will be integrated within the demonstrator for gesture recognition, as well as recognition of the colour of the clothes and of the baskets in which to place them.
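The demonstrator's colour recognition could, at its simplest, map a garment's dominant colour to a basket. The sketch below is purely illustrative, not the project's actual method: the function name, thresholds and basket labels are all assumptions, standing in for whatever classifier the demonstrator ultimately uses.

```python
# Hypothetical sketch of colour-based laundry sorting.
# classify_garment, the thresholds and the basket labels are
# illustrative assumptions, not part of the I-CUBE demonstrator.

def classify_garment(rgb):
    """Return a basket label for a garment's average RGB colour.

    A rule-based stand-in for learned colour recognition:
    saturated colours go to a 'colours' basket, while lightness
    separates 'whites' from 'darks'.
    """
    r, g, b = rgb
    lightness = (max(rgb) + min(rgb)) / 2 / 255   # 0 = black, 1 = white
    saturation = (max(rgb) - min(rgb)) / 255      # 0 = grey, 1 = vivid
    if saturation > 0.3:
        return "colours"
    if lightness > 0.7:
        return "whites"
    return "darks"

print(classify_garment((250, 250, 245)))  # whites
print(classify_garment((30, 30, 35)))     # darks
print(classify_garment((200, 40, 40)))    # colours
```

In the actual system such hand-set thresholds would be replaced or tuned by learning from the corpus of human corrections gathered in the study described below.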

We are currently preparing for our first study, with the intention of capturing the language and gestures that humans use while directing a co-bot to sort laundry. To do this we will use a Wizard of Oz method, in which a human hidden from the participant will fulfil the role of the co-bot's 'brain'. This will allow participants to express themselves naturally while the co-bot enacts their instructions, correctly or not. Errors in the co-bot's responses are expected to elicit natural corrective reactions from the human. This natural language and these gestures will provide a corpus for the co-bot to use in its learning, as well as help to improve the co-bot's sense of its environment, the objects in it and their relevance to it.

Food Design for Future Dining (FD2)

Food in the digital era is being radically transformed. The supply chain is being reshaped, with distribution networks evolving and online retailers providing a significant and ever-growing part of the market. At the same time, consumers are spending longer online, which is influencing their food preferences and consumption practices. How we leverage digital technology to deliver next-generation food systems capable of providing sustainable, healthy and safe foods remains an open question. A significant part of the work using digital technologies around food currently focuses on smart precision agriculture, but far less well explored is the significant potential of digital technologies to radically reconfigure food supply chains and the way consumers interact with foods.

Food Design for Future Dining, or FD2, is exploring how digital technologies can be used to enhance the food consumption experience, by demonstrating prototypical hybrid foods: foodstuffs created to provide a novel physical and digital eating experience, enhanced by the inclusion of relevant provenance information. As we unpack the design space, we're seeing that there is great potential, and complexity, in how food and data speak to a wide range of consumer values.

The core team brings together expertise in Mixed Reality and Human-Computer Interaction from Computer Science (Martin Flintham, Dimitri Darzentas, Emily Thorn), Food Process Engineering from Engineering (Serafim Bakalis), and Food Legislation and Compliance from Law (Richard Hyde). We're also working with Blanch and Shock, a London-based catering and design SME delivering cutting-edge culinary expertise.

We have four activities underway, each broadly aligned with a different element of food consumption as experienced by the consumer.

Enhancing the consumer experience with digital footprints. The French app Yuka (https://yuka.io/en/) is making waves by using data to alert consumers as to whether products are good for them or should be avoided. Our first demonstrator is a digitally augmented cake gift that uses augmented reality to provide two kinds of provenance to enhance the cake consumption experience. Functional or utilitarian information such as nutritional or allergen data, what we might think of as hard provenance, is presented by an app, as with Yuka. We are also exploring soft provenance: rich narrative data, such as stories about the ingredients and about how the cake was made and decorated and by whom, that speak to a broader set of values. Moving forward, we've got our sights set on chocolate.

Enhancing product development. We're building on work begun with Qian Yang in UoN Sensory Sciences that seeks to enhance the validity of consumer testing methodologies in the lab. By increasing the contextual validity of a lab study, we can reduce the failure rate of new products. Here we again turn to new immersive technologies to change the consumption experience while still allowing naturalistic food consumption. Using Augmented Virtuality, we're taking consumer panels out of the lab and into a variety of virtual environments to see how these can improve validity, or ultimately provide a radically new dining experience.

Enhancing food-as-a-service. Here we’re considering how food can be manufactured to be more relevant, more personalised or more value sensitive in the first place. We’ve finished designing a set of food development ideation cards that articulate not just flavour and physical properties, but also values, scenarios and contexts. The concepts that they embody are forming the basis for a technology probe into customised meal preparation, combined with a variety of non-soy miso recipes created by Blanch and Shock.

Finally, we are building a community of UoN academics in the broad area of "Smart Foods" and identifying key external partners to collaborate with. We will utilise existing UoN investment, e.g. through the Beacons, to create a critical mass that will enhance collaboration and enable us to respond to future funding opportunities. In the immediate term, the team presented a poster at the Connected Everything conference in June and has been demonstrating the work to various industry partners. In September, Serafim Bakalis spoke at ICEF 13, the International Congress of Food Engineering, making the case for a consumer focus on digital technologies in the food domain.

AI3SD & IoFT AI Technologies for Allergen Detection and Smart Cleaning

This event is brought to you by the AI3SD (Artificial Intelligence and Augmented Intelligence for Automated Investigations for Scientific Discovery) and IoFT (Internet of Food Things) Networks.

As food allergies and intolerances are on the rise, allergen detection and awareness are becoming more critical than ever at all stages of the food production pipeline: from cleaning the factories and kitchens in which food is produced, to detecting allergens in food, through to creating allergen-free foods in the future. Unsurprisingly, research has turned to technological solutions to combat this issue. This workshop is centred on the use of Artificial Intelligence in allergen detection and smart cleaning within food production, research areas that align across both AI3SD and IoFT. The workshop will begin with thought-provoking talks reporting on the current state of affairs and considering where we need to go in the future. Six main working group topics have been identified for this workshop, and talks will be given on the different aspects of allergen detection and smart cleaning that need to be considered before we break into the working groups for more focused discussions. There are multiple sessions for the working group discussions, so you will have the opportunity to take part in as many as you wish. The workshop will be formally recorded, and the suggestions for going forward will be captured in a position paper. Lunch will be provided and the workshop will end with networking drinks.

Programme

The programme for the day is as follows:

  • 10:00-10:30: Registration & Coffee
  • 10:30-10:45: Welcome from Professor Jeremy Frey & Professor Simon Pearson
  • 10:45-11:15: Smart Cleaning & Robots in Factories – Dr Nicholas Watson
  • 11:15-11:45: Speaker TBC
  • 11:45-12:15: TBC – Professor Jeremy Frey
  • 12:15-13:00: Lunch
  • 13:00-13:15: Speaker TBC
  • 13:15-13:30: AI in Allergen Detection – Steve Brewer
  • 13:30-14:30: Working Group Discussions
  • 14:30-14:45: Coffee Break
  • 14:45-15:30: Working Group Discussions
  • 15:30-16:00: Working Groups Report Back, Decide on Next Steps
  • 16:00-17:00: Networking Drinks

Register here

Email – info@ai3sd.org
Twitter – @AISciNet
LinkedIn – https://www.linkedin.com/in/ai3sd
LinkedIn Interest Group – AI3 Science Network Interest Group