Professor Steve Benford explains the Smart Products Beacon

The Smart Products Beacon is tackling two big questions. What are smart products? And how are they made?

A smart product is one that uses digital technologies, and especially personal data, to become more adaptive, personalised and valuable. It captures data throughout its lifetime – through both manufacture and use – and uses this to adapt itself to consumers. In so doing it blends aspects of goods, services and experiences – the three dominant product logics from economics and business – into new forms. Sounds a bit abstract? Let’s take an example…

There was a time when a car was made of rubber and iron. It was also something you bought and owned. But those days are passing. A modern car is part software: it contains an engine management system that can adapt its driving behaviour, and it hosts a variety of other services for navigation and entertainment. Some might say the modern car is really a mobile phone on wheels. For many consumers, a car is now also a service that they lease rather than a good that they own.

But the transformation doesn’t end there. In a possible future world of autonomous cars, mobility itself may be the service, with consumers summoning vehicles on demand that adapt themselves on the fly to their preferences and history of previous travel. In this world, the physical units become interchangeable and it is the data that matters. You step into a car and it becomes yours by loading your personal profile and adapting itself to you. At that point, the car is the data. As Neo learns when he visits the Oracle: “There is no spoon” (only data).

If smart products are made from new materials – personal data – then they are also made in new ways. Digitally native products such as social media are inherently co-created. Consumers either explicitly provide content in the form of the videos and photos they upload directly or implicitly provide it through their records of searches, views and likes. Smart products, even future cars, will be similarly co-created as both manufacturers and consumers engage with digital platforms and data-driven product life-cycles.

This raises a further important question – how can consumers trust future products with their personal data? How can they be sure that products are safe and secure and that they can retain control of their own data?

This vision of co-creating trusted smart products lies at the heart of our beacon. We think that it applies to all manner of products, from high value goods to consumer goods to digital media experiences. We’re looking forward to exploring the possibilities further over the coming years.

Smart Products Beacon – Soonchild and Creative Captioning – Tailoring theatre productions for D/deaf audiences

For theatre audiences who are D/deaf or hard of hearing, it is often difficult to keep up with performances. Even where a performance is signed or captioned, these accessibility additions often feel ‘tacked on’ and are typically located away from the action on stage, requiring audiences to split their attention between the performance and the support. Working with Red Earth Theatre, a production company with a long history of “Total communication” in which actors sign on stage, we have been developing ways to deliver accessibility right into the heart of a performance.

Red Earth’s new show Soonchild is touring the UK now, supported by funding from the University of Nottingham Smart Products Beacon, as well as the AHRC and the Arts Council. The show is captioned right across the set with beautiful, designed-in images, video and text, delivered using new software developed at the Mixed Reality Laboratory.

The project team developed software called ‘captionomatic’, which uses the principles of projection mapping to turn whole theatre sets into projection surfaces. While projection mapping itself is by no means a new concept, our approach has been both to simplify the process and to fit it into the wider theatre-tech ecology. Our innovation is to take a 3D model of the set – easily produced from the scale model of the set pieces typically built for any production – and project it onto the real set, using a simple system of point-matching to correctly align the physical set with its digital twin. Once that 3D model is in place, we can project images, video, text and whatever else onto the set pieces, respecting occlusion and creating an immersive canvas on which to display content.
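To give a flavour of the point-matching step, here is a minimal sketch, assuming OpenCV, of how a projector’s pose might be recovered from a handful of matched model/pixel point pairs. The coordinates, intrinsics and names below are illustrative assumptions, not captionomatic’s actual code or data.

```python
# Minimal sketch of projector/set alignment via point matching (OpenCV).
# All coordinates, intrinsics and names are illustrative assumptions,
# not values from the captionomatic software itself.
import numpy as np
import cv2

# 3D reference points on the set model (metres), e.g. corners of set pieces
# measured from the scale model. Six non-coplanar points keep the default
# solvePnP initialisation well posed.
model_points = np.array([
    [0.0, 0.0, 0.0],
    [3.0, 0.0, 0.0],
    [3.0, 2.0, 0.0],
    [0.0, 2.0, 0.0],
    [1.0, 0.5, 0.8],
    [2.2, 1.4, 0.6],
], dtype=np.float64)

# Matching 2D pixel positions, e.g. where an operator steered a projected
# crosshair until it landed on each physical reference point.
projector_points = np.array([
    [260.0, 820.0],
    [1660.0, 835.0],
    [1645.0, 215.0],
    [275.0, 200.0],
    [720.0, 610.0],
    [1230.0, 360.0],
], dtype=np.float64)

# Treat the projector as an inverse pinhole camera. A rough focal guess is
# used here; a properly calibrated value would give a tighter alignment.
w, h = 1920, 1080
focal = 1.2 * w
K = np.array([[focal, 0.0, w / 2],
              [0.0, focal, h / 2],
              [0.0, 0.0, 1.0]])
dist = np.zeros(4)  # assume negligible lens distortion for the sketch

# Recover the projector's pose relative to the set (rotation + translation).
ok, rvec, tvec = cv2.solvePnP(model_points, projector_points, K, dist)
if not ok:
    raise RuntimeError("pose estimation failed; add or re-click point pairs")

# With the pose known, any point on the digital twin maps to projector
# pixels, so content rendered on the 3D model lands on the physical set.
pixels, _ = cv2.projectPoints(model_points, rvec, tvec, K, dist)
print(pixels.reshape(-1, 2))
```

Once the pose is known, content rendered onto the digital twin can be warped into the projector’s view, which is what allows captions to land on specific set pieces and respect occlusion.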

We provide tools to read in the script from a Word document, produce a complete set of captions, and then generate the necessary cues, which can be fired by QLab (or similar theatrical show-control software) to drive our system. Theatre designers need only edit the target locations and the look and feel of the text to create beautiful captions around their sets. Different sets of captions can be delivered for different audiences as necessary – so some shows may be fully captioned while others may only have key points highlighted. We know from our research that different audiences have different preferences for how captions are delivered, and our system allows theatre companies to quickly and confidently make adjustments – even between performances of the same show. Setup of the system in a new location takes only a few minutes, something that is essential for touring productions.
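As a rough illustration of that pipeline, the sketch below reads a script from a Word document with python-docx and writes out a simple cue list. The file names, JSON cue format and field names are hypothetical stand-ins; the real captionomatic tooling and its QLab integration may work quite differently.

```python
# Sketch of the script-to-cues pipeline: read a script from a Word file,
# turn each non-empty line into a caption, and write a cue list that
# show-control software could trigger. File names and the cue format are
# illustrative assumptions, not captionomatic's real formats.
import json
from docx import Document  # pip install python-docx

doc = Document("soonchild_script.docx")  # hypothetical script file

cues = []
for para in doc.paragraphs:
    text = para.text.strip()
    if not text:
        continue  # skip blank lines between speeches
    cues.append({
        "cue": len(cues) + 1,    # fired in order during the show
        "text": text,            # the caption itself
        "target": "wall_left",   # default projection surface, editable later
        "style": "default",      # look and feel, overridable per cue
    })

with open("soonchild_cues.json", "w") as f:
    json.dump(cues, f, indent=2)

print(f"Generated {len(cues)} caption cues")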

More broadly, this new approach to projection mapping opens up substantial creative possibilities for digital media in theatre that extend beyond accessibility. Critically, it substantially reduces the technical barrier to entry for including projection-mapped media in a show. Soonchild demonstrates this with some beautiful interactions between live actors and pre-recorded media – in this case shadow puppetry, projected on set as if live.

The software was demonstrated at an accessible theatre technology day at the Arena Theatre in Wolverhampton, and plans for additional workshops and training are in the works. Although it was developed for Soonchild, the software has been designed to be easily applicable to many different types of show; it is open source and free, requiring only off-the-shelf hardware (a PC and a projector). We will also be making the hardware used in the show – a projector powerful enough to compete with theatrical lighting – available for other production companies to borrow and experiment with once Soonchild’s tour is complete.

This work was developed in partnership between Red Earth Theatre, the Mixed Reality Laboratory, the School of English and the Department of Modern Languages and Cultures.

The project website is available here.

Soonchild will be performed at the Lakeside Arts Theatre in Nottingham on Sunday 24th November – more information can be found here.

Smart Products Beacon Gathering

It’s been a busy year for the Smart Products Beacon, during which we’ve refined our research agenda and vision, launched demonstrator projects, developed our business case for the University and secured funding for initial projects, including EPSRC funding to establish a Centre for Doctoral Training for at least 65 new PhD students.

Today (8th April 2019), we are holding a one-day gathering to learn more about the beacon and explore how people can get involved.

The event commenced with an overview of RoboClean, Food Design for Future Dining (FD)2 and Industrial Co-bots Understanding Behaviour (ICUBE).

A breakout session prompted some useful research-led discussion of emerging themes, including:

  • Process planning for highly-customised products
  • Social technological interactions and implications
  • Data-enabled smart optimization
  • Digital Technology, Manufacturing, and Productivity
  • Smart Musical Instruments

A session of contributed paper presentations followed a short lunch break and poster session. These included:

  • User-Experience Design for Future Smart Vehicles
  • Managing attractiveness and tensions in digitally enhanced business environments
  • Locating the Smart Product Beacon: understanding the place-based agenda in RCUK funding (or, why economic geography matters)
  • Demonstrating a framework to investigate combined packing and scheduling problems
  • Peeling away the layers: toward a metaphor of foam to analyse composed digital-physical products
  • Physical-Digital Alignment

The day will conclude with a second breakout session – an opportunity to address and plan key beacon activities for the coming year.