My Apple Vision Pro Journey, Part 1: User Experience
One week ago, my Vision Pro arrived, and I’m even more excited than I was when it was announced. Getting the device in Europe was quite a challenge; it took a while, but we overcame the various hurdles and got one. Since I wear glasses I need optical inserts, but I wasn’t able to get the prescription Zeiss Optical Inserts, so I ordered the Reader Optical Inserts closest to my prescription. I’ll need to get a set of prescription inserts when they become available in Europe.
Apple Vision Pro at WWDC 2023
I watched the announcement, and my initial thought was “this is an iPhone moment”. Since then, I have been able to try the device twice, each time in a 5-minute guided demo; after each demo I was still impressed, but I didn’t get a chance to “play”.
I read countless reviews online and watched many YouTube videos presenting first impressions and assessments of the device, both positive and negative, though mostly negative: “It’s too expensive!”, “There is no content”, “When you pick it up, the light shield comes off”, and many more. There is some truth in many of these negative points, but do they matter? In my opinion, not really (at the moment). Why not? Quite simply, this device has solved many problems that other devices have not even come close to solving, and I want to focus on those things first.
Apple have launched a new device that finally proves XR technology can add significant value to the way we work and the way we live our lives outside of work. Over the coming weeks I will write a series of blog posts on this subject, focusing on how the Apple Vision Pro addresses problems that other XR devices have been unable to solve. I will also write a post on what I think is missing, and on limitations that Apple need to address, but I want to start with the positives, especially the features that excite me the most. Part 1 covers perhaps the most important aspect of XR devices: user experience.
User Experience on the Apple Vision Pro
In my personal life and in my job, I’ve been lucky enough to try many of the XR headsets that have come on the market. Each one has its plus points and minus points, but with all of them there was a learning curve for me and, most importantly, for the people I introduced them to.
Showing my age!
I’ve been in this industry for many years now. At the start of my career, in the early ’90s, I remember sitting next to a banking executive in front of a PC with a mouse; she considered herself computer literate, having sat in front of an IBM 3270 terminal for many years.
I told her to use the mouse pointer to click a button on the screen. She immediately picked the mouse up, placed it against the screen itself, and clicked the button; the idea of moving the pointer to the button by moving the mouse on her desk was not obvious to her. More detailed instructions, and an understanding of her past experience, were needed!
Over the last few years I have had similar experiences when introducing devices like Microsoft’s HoloLens, RealWear headsets, Magic Leap, Recon glasses and Meta Quest headsets to people who have never worn or used an XR device; even experienced IT folks have a steeper learning curve than you might think just to perform basic interactions. It is in this initial learning period that frustration can build up quickly, transforming the initial excitement and that “wow moment” into a negative experience and a reluctance to use the device. Having used the HoloLens for a few years, I am still frustrated when I have to repeat gestures multiple times because my first attempt was not recognised, or when I move a window to a place where I can no longer manipulate it the way I want.
With the Vision Pro, I just knew how to use it for the basics. I didn’t have to be told much before I could navigate the user interface and use the applications, and I did not experience the level of frustration I had in the first few hours with other XR devices. The more I used it, the more naturally I learned to use it effectively, and I quickly worked out how to arrange applications and windows to suit my ways of working. The eye tracking on the Vision Pro works like magic. Yes, you can initially make mistakes and select the wrong things, especially when controls are small and close together, but you quickly learn to pull a window closer to make fewer mistakes, and to push it back when you no longer need to interact with it so precisely. It’s as if the Vision Pro is teaching me, in a subliminal way, how to use it more effectively!
Of course, this depends somewhat on the base knowledge and experience of the user, but much less so than with any other XR device I have used; if you know how to use an iPhone or an iPad, you will be up and running on a Vision Pro very quickly. In fact, if you have experience with any multi-touch device, it won’t take you long to learn how to get the most out of the Vision Pro.
The magic of the Apple Vision Pro’s eye tracking cannot be overstated. To interact with the device, most VR headsets use controllers, while AR devices like the HoloLens use hand tracking, some degree of “gaze”, and gestures.
With a mouse, there is a degree of standardisation in how to use it: at the most basic level, all mice track along a flat surface and have a left and right button, so most people understand how to use one.
Controllers, they need to go!
In my opinion, controllers are a clumsy barrier to both adoption and immersion. When I introduce a typical VR headset to new users the initial conversation is about how to hold and use the controllers.
There is no standard way to use a VR handheld controller; each vendor does it differently. Besides, how do you put the headset on with the controllers in your hands? And if you already have the headset on, fumbling to grab the controllers and hold them the right way is frustrating!
The HoloLens is much easier to use, with no controllers, but it has a limited field of view, your hands are always somewhat tied to UI elements, and, depending on the lighting conditions and where your hands are, gestures may only succeed after multiple attempts.
The Vision Pro’s eye tracking capabilities enable:
1. Accuracy and Responsiveness: The Vision Pro uses multiple sensors and cameras to track the user's eye movements. This allows the device to respond quickly and precisely to where the user is looking, making interactions feel natural, seamless and intuitive.
2. Natural Navigation and Interaction: Eye tracking enables users to navigate the interface simply by looking at elements on the screen. For instance, selecting an app or a button can be as easy as looking at it and confirming the action with a gesture or voice command. This reduces the need for physical controllers or extensive hand movements.
3. Next Level Immersion: By tracking eye movements, the Vision Pro can adjust the visual display to match the user's line of sight, enhancing the immersive experience. This is particularly beneficial in VR applications, where realistic rendering based on gaze direction can significantly improve realism.
4. Foveated Rendering: Eye tracking allows the Vision Pro to implement foveated rendering, a technique where the device renders at the highest resolution only at the point where the user is looking, while reducing the resolution in peripheral areas. This optimises processing power and improves overall performance and battery life. There is one minor disadvantage: if you record, cast or share the experience, the resulting photo, video or remote view can appear partly blurred to the people watching it.
5. Accessibility Features: Eye tracking also supports accessibility, enabling users with limited mobility to control the device through eye movements. This can make the Vision Pro more inclusive and easier to use for many more people.
The eye tracking technology in the Vision Pro significantly enhances the user experience by making interactions more natural and efficient. It reduces reliance on external controllers and brings a new level of immersion and realism to both augmented and virtual reality experiences. By leveraging its amazing array of sensors and processing capabilities, the Vision Pro’s eye tracking sets a new bar for user interface design in XR devices.
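To make the foveated rendering idea from point 4 concrete, here is a minimal sketch of its core logic: a renderer picks a resolution scale for each screen tile based on its angular distance from the gaze point. The function name, the tile model, and the 5°/20° eccentricity thresholds are all illustrative assumptions of mine, not Apple's actual visionOS implementation or parameters.

```python
import math

def shading_scale(gaze_deg, tile_deg):
    """Return a resolution scale (1.0 = full detail) for a screen tile,
    based on its angular distance (eccentricity) from the gaze point.
    Both points are (x, y) angles in degrees. Thresholds are illustrative."""
    eccentricity = math.hypot(tile_deg[0] - gaze_deg[0],
                              tile_deg[1] - gaze_deg[1])
    if eccentricity <= 5.0:      # foveal region: render at full resolution
        return 1.0
    if eccentricity <= 20.0:     # near periphery: half resolution is enough
        return 0.5
    return 0.25                  # far periphery: quarter resolution

# A tile the user is looking straight at keeps full detail,
# while a tile 30 degrees off to the side is rendered coarsely.
print(shading_scale((0.0, 0.0), (0.0, 0.0)))   # 1.0
print(shading_scale((0.0, 0.0), (30.0, 0.0)))  # 0.25
```

This also illustrates why shared or recorded views look partly blurred to onlookers: the low-resolution peripheral tiles are invisible to the wearer, whose fovea never lands on them, but they are plainly visible in a flat capture.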
iPhone Launch
When Steve Jobs introduced the iPhone, he spoke about Apple’s “revolutionary” introduction of ways of controlling a device, the Mouse for the Mac, the Click Wheel for the iPod, and Multi-Touch for the iPhone.
Apple have now achieved a fourth: eye tracking for XR devices. Yes, other XR devices have tried to implement this; the difference is that Apple made it work accurately and made it easy and natural to use.
When I introduce the Vision Pro to people for the first time, will I be able to hand it to them and walk away? No. But I won’t need to spend as much time explaining how to use it, and their initial impressions are far less likely to build into the level of frustration that happens with other devices. When they take off the Vision Pro, they are much more likely to say “Wow!”