On Monday 5 June, I hosted a viewing party for the Mesmerise Group to watch Apple’s WWDC keynote. After much anticipation and speculation, we saw the new ‘Apple Vision Pro’ unveiled at last! As a dedicated Mixed Reality team within Mesmerise, we’ve been working for some months now with the Magic Leap 2 and Meta Quest Pro as our primary devices, knowing that Apple would eventually bring their own device to market.
One thing that has become clear to us as a team is the real benefit of having a device that can seamlessly transition between Mixed and Virtual Reality, letting us ask which environment works best for the experience we are offering a user. So we watched along, hopeful about capabilities and features we suspected were coming, and expecting that Apple would no doubt surprise us with their take on a headset.
Let’s dive into some of the features and benefits of their new Mixed Reality headset:
First off, the name. For months we’d been assuming it would be called ‘Apple Reality (Pro)’, but it’s officially the ‘Apple Vision Pro’. So, let’s break down what it can do, when it’ll be available and how accessible it’s expected to be.
It’s available early next year in the US, with other countries coming online in the following months.
It starts at $3,499 – and that’s before you add options such as alternate head straps, prescription-matching lens inserts, face-hugging light seals, or a spare external rechargeable battery pack for untethered use.
A new App Store will be available for dedicated visionOS (the new operating system) apps, alongside optimised iPhone and iPad apps that will work on Apple Vision Pro – viewable and interactable as flat floating panels.
Its form is a sleek glass, aluminium, and textile design, and, as the rumours suggested, it looks like a pair of ski goggles – with a see-through, dimmable window at the front to boot.
Under the glass and inside the frame, it packs 12 cameras and sensors to make all the ‘magic’ possible. High-res video, LiDAR, light sensing and spatial audio hardware are driven by an M2 chip and the new R1 chip, which together make sense of the vast amount of environmental data this device can capture.
Volumetric room scanning, in both sound and visual terms, means your experience is truly tailored to you. You control how much of your environment you see by turning a Digital Crown (borrowed from the Apple Watch), moving between mixed and virtual reality immersion at will, with some predetermined environments to lose yourself in. At the same time, outside observers can see which mode you are in (MR/VR) via the display on the front of the device.
An iOS-like eye-tracked interface guides you around the app selection, offering you both work and entertainment capabilities with the usual Apple finesse. Mac integration is simple: look at the Mac’s screen and it is automatically represented in the 3D spatial environment alongside the other apps running at the same time.
Foveated rendering, paired with high-precision glass lenses and near-invisible pixels, makes sure that whatever you are focussing on is always super sharp and crisp, making reading fine text a pleasant experience (or so we are told) on a wraparound display offering more than 4K of resolution for each eye.
By blending MR and VR states via the Digital Crown, Apple have just shown that we now have a half-virtual/half-mixed super reality that is likely to change a lot of people’s thinking about space and environments for their content.
Software integration with third-party apps means a whole new way to create a digital representation of yourself: an intricate scanning system automatically builds a 3D model of you, ready to take cues from the inward-facing cameras and sensors that pick up on your expressions and display them as a natural-looking avatar in an all-new video calling app.
The main user interface is built around flat panels floating in mid-air, mimicking what would ordinarily sit on your laptop or monitor screen, with 3D content shown fitting in around these panels. This seems to provide a way for traditional workstations to become bigger and wider – no longer constrained to a small area of your desk, but open. It can be used all day when plugged into external power, or for around two hours a session on the external battery pack.
Finally (and only finally for this overview, as there is lots more to unpack), there will be a new developer testing environment ahead of the device’s launch, with an add-on SDK dedicated to visionOS that builds on already-released dev tools like RealityKit and Swift, plus full Unity integration from day one.
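For a flavour of what building on that SDK might look like, here is a minimal sketch of a visionOS app in Swift, combining a SwiftUI window with 3D RealityKit content. This is an illustrative sketch based on the announced tools (SwiftUI plus RealityKit), not code from Apple or from our team; the app and view names are hypothetical:

```swift
import SwiftUI
import RealityKit

// Hypothetical minimal visionOS app: a SwiftUI scene that hosts
// both a 2D floating panel and embedded 3D content.
@main
struct HelloVisionApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        VStack {
            // Ordinary SwiftUI content appears as a flat floating panel.
            Text("Hello, spatial computing")

            // RealityView embeds RealityKit 3D content alongside the panel.
            RealityView { content in
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .white, isMetallic: false)]
                )
                content.add(sphere)
            }
        }
    }
}
```

The appeal for developers is that the same SwiftUI skills used for iPhone and iPad apps carry over, with 3D content slotted in where it adds value.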
Very impressive and, as expected, reassuringly expensive. But it does have more I/O than you could wish for and far more out-of-the-box features than most of its competitors, and it opens the way for greater adoption of this type of device for serious productivity (and entertainment beyond gaming). No, it’s not the dream device you hardly notice you’re wearing – far from it; that dream still needs to be realised.
One final takeaway:
I believe this device will greatly expand awareness of Extended Reality (XR) devices, especially among those who think such devices are reserved purely for gaming. I hope the Apple Vision Pro will demonstrate to people that we are on the verge of a new type of computing device – one that no longer requires us to be crouched over a desktop or laptop computer, but lets us stand, sit, walk and more. A category of devices that now begins to offer us alternatives to video walls and 2D representations of objects and people. As we live our natural lives in a 3D world, we now have a glimpse into a world where our remote communications, entertainment and work can match us in our 3D existence.
I know we’ve not fully arrived at that destination, but the announcement by Apple shows that we are about to see new and exciting developments appear over the next few years as spatial computing becomes more and more tangible, and we begin to see the benefits it can offer us.
It’ll be a change, and one we need to get used to, but so far, the changes look well worth the effort to realise them.
Dan Clemo, Head of Mixed Reality. Follow Dan on LinkedIn to keep up with trends in immersive tech.