I recently attended a conference on immersive technology in healthcare at Exeter University and came away buzzing with ideas and insight.
I’m a product designer. Over the past 10 years I’ve worked on products for funeral tech, computer vision, analytics and, currently, an aviation weather services app.
Where I don’t have much experience is in (a) virtual reality (VR) or (b) healthcare. Attending this conference was very much about stepping out of my comfort zone and expanding my knowledge of a sector I’d love to get involved with.
I love designing user interfaces, but I’m very aware that technology continues to evolve. It’s clear there’s strong momentum behind immersive tech like VR now that both Meta and Apple have launched their own hardware.
My expectations ahead of the event were that we’d be shown extended reality (XR) experiences to demonstrate the potential of the technology and get people excited about new opportunities.
To an extent, this did happen – but with the focus on healthcare the speakers were not interested in hype or pitching aspirational concepts.
They gave practical examples of what they had been doing and the measurable impacts it had – both positive and negative.
Everyone was very clear that progress was being carefully monitored and that technologies were being rolled out only when there was clear evidence they improved patient care.
All the talks were by people working within healthcare to an audience mostly made up of healthcare professionals. My colleague Murray and I were exceptions to this.
That meant there was relatively little focus on the details of the technology and much more on the outcomes of using XR as part of training and treatment.
Other sectors might have more scope to experiment with what a technology can do, which is an exciting and often necessary part of the development cycle. But seeing it framed from the healthcare perspective was grounding.
It served as an important reminder that design should focus on people’s needs and validated outcomes, rather than getting sucked into using cool new technologies for their own sake.
All of which can perhaps be summarised as:
XR has the potential to revolutionise all aspects of the healthcare system with the ultimate benefit of improving patient care.
Developing an XR application can be relatively expensive. It requires the creation of a 3D environment and possibly the recruitment of actors.
There were some interesting questions probing whether the XR experiences being demonstrated offered greater medical benefit than more traditional digital experiences. Could a 2D experience on an iPad do the job just as well, or better?
In some cases, it was clear that the XR experience could not be replicated without a headset. In others, it was less clear what benefit the patient gained from performing an exercise in XR rather than following guidance on an iPad screen.
However, early observations and ongoing research seem to suggest the brain processes experiences and information differently when they are presented in 3D.
Exactly why this might be the case remains a mystery for now, but the XR experience does appear to promote stronger rehabilitation outcomes.
While early indicators suggest XR has an important role to play in the future of healthcare, important questions are being asked to ensure it is not introduced at the expense of certain age categories, abilities, or socio-economic groups.
In the case of XR being used as part of a rehabilitation programme at home, this could exclude people along exactly those lines.
No specific solutions were presented for these challenges at the conference. But there is broad awareness of the potential for XR to widen inequality, and equity is very much at the forefront of thinking.
This conference made me aware of how different the skills needed to design for XR are from the more traditional digital design experience I have.
Human-centred design for digital products and websites already has its challenges around accessibility and the varying levels of comfort people have with technology.
But there’s also more-or-less universal access to computers and smartphones, with accessibility tools and established testing methods.
So accessibility need not be a significant blocker in the ultimate uptake and usability of the products.
XR, I’m realising, is in a different place. The hardware isn’t available to most people. And as I understand things, software that runs on one VR headset might not run on another, further reducing the potential audience.
The hardware itself is rapidly improving and becoming more powerful and more comfortable to wear – but it’s a long way off being as easy to pick up and engage with as your mobile phone.
And when you do put on that headset, you enter an awkward situation: immersed in a 3D XR world while still physically present in the real world. This can result in cybersickness for some users.
And for some users, there can be problems performing the physical actions the XR world requires within the constraints of their home. Essentially, there’s a risk of tripping over the sofa or swinging an arm and breaking a vase.
There’s clearly a long way still to go before we figure out how to design truly human-centred solutions for XR.
But there’s also clear consumer demand for this type of immersive experience, and examples of how it is already delivering social benefits, including in healthcare.
As with all technology, the capabilities are growing exponentially. I believe it’s only a matter of time before interaction with the hardware becomes less intrusive and engaging with XR is as low-friction as using your smartphone is now.
As a product designer, I’m still not sure what my next steps with XR are. But while some of the technical knowledge might change, the core design thinking skills remain the same.
It reminds me a bit of the early 2000s when I started building websites. It was a fun time to be involved in the web because there were no standards.
We were constantly discovering new and exciting opportunities for new types of content and how it could be presented.
But it was also terrible for accessibility and every website was a new experiment in approaches to navigation.
I see the same thing in XR. There’s huge potential in what it can do, but how people interact and experience it is still being understood.
There is a lot to be learned about how designers can present experiences to users.
Is it better to reuse existing user interface patterns because they’re familiar? Or look for something completely different and more appropriate to a 3D environment?
How do we make these new forms of interaction engaging and intuitive?
For me, this is the perfect blend: inspired creative thinking balanced with research and applied through human-focused design.
Which, fortunately, is what we do best here at Sparck. So I know I am in good company on this learning journey.