Touch, Taste, Smell Technology

Everyday life is increasingly transformed by technological advances such as novel multisensory devices and interfaces. These multisensory technologies do not just stimulate our eyes (think of screens) and ears (audio systems), but also consider how and what we touch, smell, and taste in our daily lives. Technology enables the creation of multisensory experiences that enrich and augment the way we interact with the world around us. We are pursuing various projects on the stimulation design, perception, and experience of multisensory interfaces.

Designing smell-based experiences through advances in hardware and software

Olfactory (smell-based) interactions and interfaces are relevant across a wide variety of use and application contexts, from entertainment and rehabilitation to supporting learning and enriching immersive experiences such as virtual and augmented environments. Olfactory stimulation in interaction design can not only augment users' experiences with technology and create novel interactive experiences (e.g., navigating virtual environments), but also amplify sensory perception for people with sensory impairments (e.g., blindness and deafness). Moreover, olfactory stimulation can provide creative solutions for conveying information through smell and can facilitate decision making in situations of visual overload (e.g., driving a car). In this project, Dr Emanuela Maggioni is combining knowledge of olfactory perception with a systematic analysis of the technical parameters of scent delivery and control.
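To give a flavour of the technical parameters such an analysis must cover, the sketch below models a single scent pulse as a small data structure with validation. The class name, fields, and ranges are illustrative assumptions for this page, not the interface of any particular scent-delivery device.

```python
from dataclasses import dataclass


# Hypothetical sketch: the parameters a scent-delivery controller would
# need to specify and validate. All names and ranges are assumptions
# made for illustration, not a real device API.
@dataclass
class ScentPulse:
    channel: int          # which scent reservoir to actuate
    intensity: float      # normalised output intensity, 0.0 to 1.0
    duration_ms: int      # how long the delivery valve stays open
    onset_delay_ms: int   # delay relative to an accompanying visual/audio event

    def validate(self) -> None:
        """Reject physically meaningless parameter combinations."""
        if not 0.0 <= self.intensity <= 1.0:
            raise ValueError("intensity must be within [0.0, 1.0]")
        if self.duration_ms <= 0:
            raise ValueError("duration must be positive")
        if self.onset_delay_ms < 0:
            raise ValueError("onset delay cannot be negative")


pulse = ScentPulse(channel=1, intensity=0.6, duration_ms=500, onset_delay_ms=120)
pulse.validate()
print(pulse)
```

Making intensity, duration, and onset timing explicit, controllable quantities is what separates such systems from manual approaches like scented pens, where none of these can be held constant across trials.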

Tackling the loss of smell and its impact on our personal health and wellbeing

Smell disorders can detrimentally impact a person's wellbeing. When someone loses their sense of smell, food can lose its flavour and social interactions can become impaired; this is especially relevant among an ageing population. Current solutions for measuring and training people's sense of smell are neither standardised nor accurate, relying on manual approaches such as scent-soaked pens and scratch-and-sniff materials. These approaches have several shortcomings: a lack of control over the smell stimuli, inconsistent delivery parameters such as intensity and concentration, and the absence of comparable digital performance records for tracking changes over time. This project is led by Dr Emanuela Maggioni at OWidgets with support from Prof Marianna Obrist, and has strong links to the UCL Institute of Healthcare Engineering, where Prof Obrist is Deputy Director for Digital Health.

Enabling the study of multisensory integration and effects on user experiences

We typically perceive the world through multiple senses, and the brain decides whether to integrate or segregate their signals. Integrating information across the senses is key to perception and action, and influences a wide range of behavioural outcomes, including detection, localisation, and, more broadly, reaction times. Studies on multisensory integration have typically relied on audio-visual interactions because, among other reasons, the technology to deliver audio-visual stimuli is relatively well-established and widely available (e.g., screens, headphones). Emerging multisensory technologies enable new ways to stimulate, replicate, and control sensory signals (touch, taste, and smell) and thus expand the possibilities for multisensory integration research. For example, with regard to smell and taste stimulation, we see growing development efforts to create more flexible and portable solutions that vary in their capabilities compared to established laboratory equipment such as gustometers and olfactometers. Importantly, emerging olfactory displays and smell-delivery technologies are becoming smaller, wearable, and more modular, allowing less invasive stimulation within and outside laboratory settings. Similarly, novel gustatory stimulation approaches are emerging, such as taste levitation systems that exploit acoustic principles to deliver precisely controllable taste stimuli to the user's tongue. Dr Patricia Cornelio is working on this research project, pushing the boundaries of current audio-visual research on multisensory integration. Patricia is also pioneering research on the sense of agency in this emerging multisensory interaction space.
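One classic way the benefit of multiple senses shows up in reaction times is statistical facilitation, often formalised as a race model: the response is triggered by whichever modality finishes processing first, so responses to redundant bimodal stimuli are on average faster than to either modality alone. The simulation below is a minimal sketch of that idea; the reaction-time means and spreads are made-up illustrative values, not data from this project.

```python
import random

random.seed(42)  # fixed seed so the simulation is reproducible


def unimodal_rt(mean_ms: float, sd_ms: float) -> float:
    """Draw one simulated reaction time (ms) for a single modality,
    floored at a plausible minimum motor latency."""
    return max(100.0, random.gauss(mean_ms, sd_ms))


def bimodal_rt(mean_a: float, mean_b: float, sd_ms: float) -> float:
    """Race model: the faster of the two modality-specific
    processing times wins (statistical facilitation)."""
    return min(unimodal_rt(mean_a, sd_ms), unimodal_rt(mean_b, sd_ms))


def mean_rt(samples: list[float]) -> float:
    return sum(samples) / len(samples)


N = 10_000
visual_only = [unimodal_rt(350, 50) for _ in range(N)]
audio_visual = [bimodal_rt(350, 350, 50) for _ in range(N)]

# The minimum of two independent draws is on average faster than
# either draw alone, even with no neural interaction at all.
print(f"visual alone: {mean_rt(visual_only):.0f} ms")
print(f"audio-visual: {mean_rt(audio_visual):.0f} ms")
```

In empirical work, violations of the race model's predicted bound are taken as evidence of genuine integration rather than mere statistical facilitation; new smell- and taste-delivery hardware makes it possible to run the same logic on modality pairs beyond audio-visual.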

Exploring the future of food and computing

In most species, the sense of taste is key to identifying and distinguishing potentially nutritious and harmful foods, leading to their acceptance (or rejection). Tastes are encoded by taste receptor cells on the tongue, which detect chemicals corresponding to the different taste qualities: sweet, sour, salty, bitter, and umami (or savoury). The sense of taste has been extensively studied to understand how the taste system works, including its anatomy, pathways, and neural code. However, there is still a lack of understanding of how taste can be exploited in Human-Computer Interaction (HCI), partly because the field has relied heavily on vision and hearing, with emerging attention on touch. Here we are interested not only in contributing to the growing field of multisensory human-food interaction design, but also in developing novel gustatory interfaces that open up new design spaces for enjoyable, healthy eating experiences on Earth and beyond. Given the increasing possibilities of short- and long-term space travel to the Moon and Mars, it is essential not only to design nutritious foods but also to make eating an enjoyable experience.

Exploring the effect of multisensory (e.g. smell) stimulation on our body image

A negative or distorted body perception can elevate the risk of eating disorders, isolation, and mental illness. Such distortions of one's body image affect how people interact with others and with their environment, negatively impacting their emotional state. Multisensory influences on different kinds of body perception have been investigated since the early 1990s. Of all the human senses, vision, touch, and proprioception are the most studied with respect to their impact on body image, with audition recently joining the effort. However, one important sense remains largely unexplored from a body image perspective: smell. This is surprising given its ubiquity (scents surround us even when we are not aware of them). PhD student Giada Brianza at the University of Sussex, supervised by Prof Marianna Obrist, is currently exploring how multisensory stimuli, especially olfactory stimuli, influence the perception of and feelings towards our own body, expanding previous knowledge on crossmodal correspondences. The expected outcomes of her project will not only advance the integration of multisensory stimuli in relation to body image, but also expand what we know about crossmodal correspondences and the role of smell in designing future applications, such as wearable devices and user experiences.