Innerspace – an emotion analytics platform, delivering insights for an XR world
Just as web analytics helped businesses optimise engagement based on page views and clicks, emotion analytics promises to provide new insights for the era of augmented reality and the spatial web.
Parallel partnered with Emteq Labs, one of the leading firms in this space, to help bring to life their vision of more democratised tools for understanding emotional responses.
Emotion analytics is an emerging field that uses data from multiple sensors within head-mounted displays, combined with cutting-edge machine learning techniques, to understand people's behaviour and emotional responses.
By pairing biometric sensors with VR, Emteq Labs can measure emotional responses in conditions that simulate real-world environments. Using sensors within head-mounted displays, they can monitor heart rate, facial electromyography, skin conductance and eye movements to determine someone’s level of arousal - how activated or excited they are - and their valence - whether their reaction is positive, negative or neutral.
Visualising this complex, multi-dimensional data - and producing actionable insights from it - presents many challenges, meaning that it's normally only undertaken by specialised research teams. Traditional visualisation and business intelligence tools are not equipped to reveal relationships between where someone is, what they're experiencing and how it makes them feel.
For Emteq Labs, making their technology more accessible presents an opportunity to expand into new sectors, such as content creation, retail, marketing and even the real-time personalisation of the digital tools used in high-stress working environments like control rooms, operating theatres or trading desks.
“The ambition is to be able to recommend or actuate changes to environments – either during the design process, or at runtime within digital environments – in order to make it more likely that people will be in a desired emotional state.”
To deliver on this vision, we needed to fuse multiple datasets together into a single interface, which would allow people to navigate, make sense of and manipulate this high dimensional information.
Given that there were so many dimensions in play, it was clear that a single visualisation would not provide the clarity needed - or the ability to filter data.
Our approach was to design a number of interactive, interlinked visualisation components that provide information but also act as filtering mechanisms - allowing you to play back and cross-tabulate data to reveal previously hidden patterns, or to explore insights generated automatically by the system.
The resulting platform - Innerspace - provides a highly visual approach to analysing emotion data, exposing data and insights pertaining to personality traits, behaviour and interactions, emotional & stress response to stimuli and perceived cognitive load.
Working with real-world data
In May 2019, Emteq Labs partnered with the University of Bournemouth to deliver the world’s largest biometric data collection study, gathering data from over 780 individuals.
Participants could explore and interact with an immersive environment, its sounds and objects. Environments were designed and controlled to deliver different stimuli: negative, neutral and positive.
The study provided us with a rich data resource and a foundation for our design work. For each participant, we were able to examine:
– Background personality traits (survey responses)
– Where in the immersive space they were
– Which stimuli they encountered
– What they saw on their headsets
– Their valence and arousal levels, moment-by-moment
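The per-participant data listed above can be pictured as a simple record structure. This is a minimal sketch under our own assumptions - the class and field names are hypothetical illustrations, not Emteq Labs' actual schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AffectSample:
    """One moment-by-moment reading for a participant (hypothetical shape)."""
    timestamp: float           # seconds since session start
    position: tuple            # (x, y, z) location in the immersive space
    stimulus: Optional[str]    # stimulus in view at that moment, if any
    valence: float             # -1 (negative) .. 1 (positive)
    arousal: float             # 0 (calm) .. 1 (activated)

@dataclass
class Participant:
    """A participant's survey traits plus their stream of affect samples."""
    participant_id: str
    traits: dict               # survey-derived personality traits
    samples: list = field(default_factory=list)

# Example: one participant with a single sample recorded near a stimulus.
p = Participant("p-001", traits={"openness": 0.7})
p.samples.append(AffectSample(0.0, (1.0, 0.0, 2.5), "butterflies", 0.4, 0.6))
```

Keeping the survey traits separate from the time-series samples mirrors the split between background characteristics and moment-by-moment responses described above.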
Mapping emotion over time
One of the key visualisations we needed to resolve was a way to show the shifting patterns of participants’ arousal and valence scores as they evolved over time - or in response to specific stimuli.
Using Kernel Density Estimation plots rather than traditional scatter plots provided an intuitive map of ‘Emotional space’ that worked equally well for a single person or groups of many people.
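The idea behind such a plot can be sketched in a few lines. This is a minimal numpy-only Gaussian KDE, assuming valence and arousal are each scored in [-1, 1]; the sample data and bandwidth are illustrative, not values from the study:

```python
import numpy as np

# Hypothetical affect samples: rows of (valence, arousal), both in [-1, 1].
rng = np.random.default_rng(0)
samples = rng.normal(loc=[0.3, 0.1], scale=0.2, size=(500, 2))

def kde_grid(points, bandwidth=0.1, resolution=50):
    """Evaluate a Gaussian kernel density estimate on a regular grid
    spanning the valence/arousal plane [-1, 1] x [-1, 1]."""
    axis = np.linspace(-1, 1, resolution)
    xs, ys = np.meshgrid(axis, axis)
    grid = np.stack([xs.ravel(), ys.ravel()], axis=1)    # (res^2, 2)
    diff = grid[:, None, :] - points[None, :, :]         # (res^2, n, 2)
    sq_dist = (diff ** 2).sum(axis=2)
    # Average the Gaussian kernels centred on each sample point.
    density = np.exp(-sq_dist / (2 * bandwidth ** 2)).mean(axis=1)
    return density.reshape(resolution, resolution)

# The resulting grid is what gets rendered as the 'emotional space' heat map.
density = kde_grid(samples)
```

Rendering this grid as a contour or heat map smooths hundreds of overlapping points into readable regions, which is why it scales from one participant to a whole cohort where a scatter plot would not.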
Mapping emotion to spaces
Another design challenge was mapping the emotional responses of an individual - as well as the aggregated responses of many people - to the immersive space they experienced.
Because affect scores are made up of two dimensions (arousal/valence), mapping them into physical space is not always intuitive.
The final designs assign values to grid squares for multiple people, with height denoting arousal and colour denoting valence. An individual’s levels change continuously as they move through the space.
To bring precision to the reading of these visualisations, the user should be able to view them alongside the KDE plot.
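The grid-square aggregation described above can be sketched as a binning step. This is a rough illustration under assumed conventions (a 10x10-unit floor plan, arousal in [0, 1], valence in [-1, 1]); the function and variable names are our own, not Innerspace's:

```python
import numpy as np

# Hypothetical moment-by-moment readings: an (x, y) position in the space
# plus the participant's arousal and valence at that moment.
rng = np.random.default_rng(1)
n = 1000
x = rng.uniform(0, 10, n)
y = rng.uniform(0, 10, n)
arousal = rng.uniform(0, 1, n)     # drives cell height
valence = rng.uniform(-1, 1, n)    # drives cell colour

def grid_affect(x, y, arousal, valence, size=10, cells=5):
    """Average arousal and valence per grid square over the space."""
    ix = np.clip((x / size * cells).astype(int), 0, cells - 1)
    iy = np.clip((y / size * cells).astype(int), 0, cells - 1)
    counts = np.zeros((cells, cells))
    height = np.zeros((cells, cells))   # mean arousal per square
    colour = np.zeros((cells, cells))   # mean valence per square
    np.add.at(counts, (iy, ix), 1)
    np.add.at(height, (iy, ix), arousal)
    np.add.at(colour, (iy, ix), valence)
    visited = counts > 0
    height[visited] /= counts[visited]
    colour[visited] /= counts[visited]
    return height, colour

height, colour = grid_affect(x, y, arousal, valence)
```

Each square then renders as a column whose height encodes mean arousal and whose colour encodes mean valence, aggregating cleanly over any number of participants.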
Mapping segments of interest
This ternary plot is used to provide at-a-glance context around segments of participants that have either been manually created by the user - or generated automatically.
It helps you understand what makes the people within a segment similar - is it just characteristics, just their overall affect scores, just their response to something specific, or a mix of these?
For example, the user might create a segment based on “Women under 30” but not yet know that there was a strong similarity in this group’s response to a specific stimulus (which the plot would reveal).
An analogy might be a plot that showed groups of people who share similar tastes in things - for example music, wine and films. You could have a set of people who share no similarity at all (who we would discount as a segment).
Equally, you might find people who love jazz and hate red wine, but have no similarity in terms of the type of films they like.
We parse the data, looking at each dimension one by one, trying to find similar people.
If these are detected, the dimension in question becomes the primary one. We then look at those similar individuals across all the other dimensions to see if they exhibit any other similarities.
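The dimension-by-dimension scan can be sketched as follows. This is a deliberately simplified illustration - judging similarity by a standard-deviation threshold is our assumption, and the participant data and dimension names are hypothetical:

```python
import statistics

# Hypothetical per-participant scores across named dimensions
# (a characteristic, a stimulus response, and an overall affect score).
participants = {
    "p1": {"neuroticism": 0.80, "valence_butterflies": -0.70, "arousal_overall": 0.4},
    "p2": {"neuroticism": 0.82, "valence_butterflies": -0.65, "arousal_overall": 0.1},
    "p3": {"neuroticism": 0.79, "valence_butterflies": -0.72, "arousal_overall": 0.9},
    "p4": {"neuroticism": 0.20, "valence_butterflies": 0.60, "arousal_overall": 0.5},
}

def similar_dimensions(people, threshold=0.1):
    """Return the dimensions along which the given people cluster tightly,
    judged by the standard deviation of their scores."""
    dims = next(iter(people.values())).keys()
    hits = []
    for dim in dims:
        values = [scores[dim] for scores in people.values()]
        if statistics.pstdev(values) < threshold:
            hits.append(dim)
    return hits

# p1-p3 cluster tightly on neuroticism, making it the primary dimension;
# checking the same people on every other dimension reveals a secondary
# similarity in their response to the butterflies stimulus.
group = {p: participants[p] for p in ("p1", "p2", "p3")}
print(similar_dimensions(group))  # ['neuroticism', 'valence_butterflies']
```

This matches the two-part segment described next: similarity split between a characteristic and a stimulus response.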
This example segment (bottom left, below) would be 50% based on characteristics (neuroticism) and 50% based on stimuli (valence in response to butterflies).
Emotion analytics and the spatial web
Over the last thirty years, communication technologies have evolved from the fixed web (PCs), through the mobile web (smartphones), and are now at the start of the spatial web (VR/AR).
In his deep-dive into the spatial web, strategist and thought leader Matthew Ball [@ballmatthew] explains that a nascent metaverse is already here if you know where to look. Manifestations of the metaverse are most obvious in fun experiences such as Fortnite, Pokémon Go and Snapchat lenses. However, improvements in user experiences across entertainment, design, training and healthcare all require an ability to visualise this new category of behavioural data.
We are excited to hear how data visualisation of the spatial web might help your business. If you have a project you'd like to discuss, please contact