Apple’s new Vision Pro headset pushes mixed reality to a new level of realism and accuracy with advanced technology.
Tim Cook made it clear that he believes augmented reality is the future. Long before lightweight, full-featured AR glasses are possible, mixed reality headsets can begin exploring these capabilities. Here’s how Apple’s Vision Pro headset could deliver the best XR experience ever.
Display and sound quality
Apple’s new headset promises incredibly sharp images on HDR micro-OLED displays that Apple says pack more pixels than a 4K TV into each eye. That more than doubles the resolution of most VR headsets, and OLED delivers vibrant color and true blacks.
Spatial audio is built-in along with a processor that scans room features and materials with what Apple calls “audio ray tracing” to optimize the sound quality.
That’s only part of the equation since mixed reality relies on multiple cameras and sensors to orient video and audio correctly in 3D space around you.
Cameras and sensors
The Apple Vision Pro features multiple high-resolution cameras to capture video, plus several IR cameras and illuminators that track head and hand movements.
A LiDAR scanner and TrueDepth camera provide automatic 3D mapping of the environment.
Inside the Vision Pro, rings of IR cameras and illuminators track eye movements for precise, gaze-based input. For such intensive calculations, Apple includes a newly developed R1 chip to handle real-time sensor processing. An M2 chip, the same type used in many of its Mac computers and the iPad Pro, takes care of graphics and general processing.
The R1 chip combines input from 12 cameras, five sensors, and six microphones with a latency of just 12ms. Apple points out this is eight times faster than the blink of an eye, which typically lasts around 100ms.
This advanced technology should give the Vision Pro the best mixed reality passthrough view on the market. Apple’s demonstration videos support this idea, but we’ll reserve final judgment until the device ships and reviews begin to appear.
Rather than a one-way solution, Apple provides reverse passthrough as well. Apple’s EyeSight feature uses a lenticular lens to project a 3D image of the wearer’s eyes, rendered at the correct angle for each onlooker, onto the front of the Vision Pro.
When someone approaches, your eyes will appear on the front display and you’ll be able to see out, creating a two-way virtual transparency. It sounds uncanny, and hopefully it renders convincingly. There is a danger the Vision Pro’s front display could fall into the “uncanny valley,” where an almost-human image looks unsettling rather than natural.
For FaceTime, Apple needed a digital version of the wearer’s face, since the internal cameras see only the eyes while the headset is worn. The outside cameras can scan your face to create a digital Persona that Apple claims will dynamically match your facial movements with a realistic appearance. Matching the expressiveness of the human face could prove challenging, and it’s not certain Apple will succeed.
A Persona has depth when viewed from another Vision Pro and appears as a traditional 2D image on an iPhone, iPad, or Mac. That means you can FaceTime someone while wearing Apple’s headset, and your Persona might look remarkably like your real face.
This is just one of the many ways Apple’s XR headset might redefine our homes.