Exploring the VR app ecosystem
With a really open brief, we started with a deep dive into the VR app ecosystem. We heard how the team had experimented with experiences like mini-golf and Gravity Sketch, but that there were limitations to the utility of these applications.
Within the B2B space, there are a host of virtual meeting apps like MeetinVR, Engage, Glue and Arthur, but the quality of these experiences is variable. The best have good graphics, expressive avatars and allow you to screencast all the 2D applications that you’re already using.
Given that VR meetings won't be for everyone, and that there's a much greater faff factor around joining a session, the incremental benefit of holding collaborative meetings in these spaces is questionable.
More compelling use cases can be found in apps that allow you to explore, experience and manipulate 3D models – for example architectural models or factory layouts.
Another small, but interesting category of collaborative VR apps can be described as “tools for thought”.
Notable examples are Noda, which allows you to build mind maps, and Softspace, which offers curated spaces for ideas, images and web pages.
Both offer really nice interaction. Being 'inside' your thinking, within a calm environment, genuinely promotes mindful thought, in terms of both focus and serenity.
This sense of calm is often missing from VR experiences, and it was something we wanted to explore further.
Identifying relevant use cases
The feedback on the client's early VR experiments was clear. For people to adopt the technology into their day-to-day, there needs to be some collaborative purpose. We examined a number of potential use cases, such as documenting the team's development roadmaps, presenting strategies and objectives across business units and visualising the pipeline of requests coming into the team.
We then considered how each stacked up against the factors that would justify a dedicated virtual environment; the strongest use case centred on technology roadmaps.
We were conscious that whatever we might create in VR, it wouldn't be the only view on this information. Sometimes, you just need to quickly check a delivery date, without putting on a headset. The client already documented the discussion around their goals and progression towards them using tools like Miro boards. But as good as these tools are, it feels like there’s something missing…
"We become what we behold. We shape our tools, and thereafter our tools shape us."
Could VR elevate these processes?
Could it allow you to stop thinking in terms of stickies and Gantt charts and start thinking in terms of a quest that you're on together? Whilst the client's team don't work with 3D models in the way that architects or engineers do, they could meet, explore, understand, discuss, update and amend their plans in a dedicated, data-driven VR space that provides an immersive and quantified view of current initiatives.
The main building blocks of the system could be the goals, objectives or key results that teams are working towards, mapped to the points in time they need to be delivered by.
Tracks or pathways could be used to represent the programmes, projects, milestones and tasks that need to be completed.
In addition to positioning these objects in space (and time), every item could also be interacted with to define metrics such as effort, importance, completion percentage, status, velocity or sentiment/morale.
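To make this concrete, here's a minimal sketch of how those building blocks might be modelled as data. The names (`Objective`, `Track`, `Metrics`) are invented for illustration, not taken from the actual implementation:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Metrics:
    """Editable per-item metrics surfaced in the VR space."""
    effort: int = 0              # e.g. relative sizing points
    importance: int = 0          # relative priority
    completion_pct: float = 0.0  # 0.0 to 100.0
    status: str = "not started"
    morale: int = 0              # team sentiment, e.g. -2 to +2

@dataclass
class Objective:
    """A goal or key result, mapped to its delivery date."""
    name: str
    due: date
    metrics: Metrics = field(default_factory=Metrics)

@dataclass
class Track:
    """A pathway: a programme or project and its milestones."""
    name: str
    milestones: List[Objective] = field(default_factory=list)

    def completion(self) -> float:
        """Average completion across milestones (0 if empty)."""
        if not self.milestones:
            return 0.0
        total = sum(m.metrics.completion_pct for m in self.milestones)
        return total / len(self.milestones)
```

A track's aggregate completion can then drive what's rendered along its pathway, while the per-item metrics remain individually editable.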
In previous conversations with the client, we discussed the potential for "Kit bashing" - allowing the team to import and combine 3D models to construct spaces.
Whilst we agreed that it is vital that users feel a sense of ownership of the Planscape, building from scratch in VR can be fiddly, time-consuming and even anxiety-inducing.
Our recommendation was to provide a system of simple 'primitives' that can be added to the space and configured via a lightweight UI in order to convey rich information.
Starting in 2D, we developed a visual language of elements to represent objectives, milestones and work streams of activity with dependencies and methods for editing and displaying metadata such as level of effort, impact and start/end dates, blockages and delays.
These were then evolved into a 3D design system, initially in Cinema 4D and then generated programmatically within Unity.
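As a hedged illustration of that programmatic generation (the production system was built in Unity; the function and parameters below are invented for this sketch), each primitive's geometry could be derived directly from its metadata:

```python
def primitive_scale(effort: int, impact: int,
                    max_effort: int = 13, max_impact: int = 5) -> tuple:
    """Map metadata onto a primitive's dimensions (illustrative only).

    Effort drives height and impact drives footprint, so larger, more
    important items read at a glance from across the space. Inputs are
    clamped so every primitive stays within a sensible size range.
    """
    effort = min(max(effort, 0), max_effort)
    impact = min(max(impact, 0), max_impact)
    height = 0.5 + 2.0 * effort / max_effort   # 0.5 to 2.5 units
    radius = 0.2 + 0.8 * impact / max_impact   # 0.2 to 1.0 units
    return height, radius
```

Driving geometry from data like this keeps the visual language consistent: two items with the same metadata always look the same, wherever they sit in the space.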
Interaction and object manipulation
Originally we had imagined that users would be able to interact with objects directly within the landscape - moving milestones around and 'terraforming' the landscape based on expected effort.
Whilst we had anticipated that a 'palette' UI would be helpful, it wasn't until we started testing builds that we realised how physically tiring direct manipulation could become over extended periods.
A more successful approach was to use pointers to select objects, thumb-sticks to manipulate them in space and a flat control panel to fine tune settings.
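A minimal sketch of the thumb-stick part of that scheme, assuming a frame-based update loop and using invented names (this is not the actual Unity code), might look like:

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    """A selectable item in the landscape, e.g. a milestone marker."""
    name: str
    x: float = 0.0
    z: float = 0.0

def apply_thumbstick(obj: SceneObject, stick_x: float, stick_y: float,
                     dt: float, speed: float = 2.0) -> None:
    """Nudge the currently selected object across the landscape.

    stick_x / stick_y are the thumb-stick axes in -1..1; dt is the
    frame time in seconds. Scaling by dt keeps movement speed
    independent of frame rate.
    """
    obj.x += stick_x * speed * dt
    obj.z += stick_y * speed * dt
```

Selection happens separately via the pointer (a raycast in the real system); once an object is selected, this per-frame nudge handles coarse placement, with the flat control panel reserved for fine-tuning.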
What we learned
The final proof of concept application delivered on our objective of creating a contemplative space for exploring scenarios, planning activities and reviewing progress. We definitely see potential for immersive information-scapes of this type as an effective way to 'put yourself in the picture'.
Aside from the sheer physicality of VR experiences, we also learned a lot about the need to control the amount of information you're expected to process at any one time. When you're surrounded by data, it can quickly become overwhelming.
Sometimes a visceral experience of complexity is exactly what's required, but at other times you need to strip things back to make a decision.