Spatial tracking for Apple Vision Pro

Spatial Tracking XR Attention Heatmap Dinosaur Museum

Spatial data analytics ready for the Apple spatial computer

The Vision Pro is the Cupertino company’s biggest product launch since the Apple Watch and possibly its most important since the iPhone. Spatial computing is a significant platform shift to a post-screen world with natural human interfaces (eyes, hands and voice). Apple have taken their time to ensure the form factor feels right, with first-hand reports describing the Vision Pro as intuitive, comfortable and personal. Human experience design, in addition to tech specs, set a new standard in the smartphone and smartwatch markets, and the Vision Pro does the same for the XR headset market. Apple still have a long list of refinements planned and developers will reimagine new experiences, so the good news is that spatial tracking for Apple’s new spatial computer is ready on day one. Our spatial data analytics platform is available now to AR/VR content creators developing the next generation of Extended Reality apps through the lens of human experience.

Unity plug-in delivers the metrics of human behaviour in XR

Apple’s new visionOS launches with Unity support, so our existing Unity plug-in provides immediate access to spatial tracking (no code required) to analyse and optimise 3D experiences. Attention heatmaps show what people are looking at, based on the time, sequence and priority of viewed areas, so you can optimise the most important content (and eliminate poor-performing elements). Navigation scatterplots show how people move through content experiences, based on their position and pathway in 3D environments, so you can optimise the order and priority of the events you want to promote. Add the metrics of human behaviour to the development of new Apple Vision Pro applications for a complete picture of user experience. Existing CORTEXR customers can also compare Apple’s headset experience to their other XR projects to analyse relative performance across all AR/VR headsets and iOS/Android handsets. Time will tell, but we can’t wait to see how the Vision Pro user experience compares.
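To make the idea concrete: attention data of this kind can be derived from nothing more than 6DOF head poses. The sketch below is a hypothetical illustration (not the CORTEXR plug-in’s actual code): it casts each head pose’s forward vector onto an assumed wall plane at a fixed depth and bins the intersection points into a grid, weighting each cell by dwell time. The function name, plane depth and grid size are all assumptions for the example.

```python
import numpy as np

def attention_heatmap(positions, forwards, dt, wall_z=5.0, grid=(8, 8), extent=4.0):
    """Bin head-gaze/wall intersections into a dwell-time grid.

    positions, forwards: (N, 3) arrays of 6DOF head samples (x, y, z metres).
    dt: seconds per sample; wall_z: plane depth; extent: wall half-width.
    """
    heat = np.zeros(grid)
    for p, f in zip(positions, forwards):
        if f[2] <= 1e-6:                 # facing away from the wall
            continue
        t = (wall_z - p[2]) / f[2]       # ray-plane intersection parameter
        if t <= 0:
            continue
        hit = p + t * f
        # Map the hit point's x/y onto grid cell indices
        ix = int((hit[0] + extent) / (2 * extent) * grid[0])
        iy = int((hit[1] + extent) / (2 * extent) * grid[1])
        if 0 <= ix < grid[0] and 0 <= iy < grid[1]:
            heat[iy, ix] += dt           # accumulate dwell time in seconds
    return heat

# Simulated session: user faces the wall centre for 100 samples at 10 Hz
pos = np.zeros((100, 3))
fwd = np.tile([0.0, 0.0, 1.0], (100, 1))
h = attention_heatmap(pos, fwd, dt=0.1)
print(round(h.sum(), 6))  # total dwell time in seconds: 10.0
```

The same pose stream, plotted as raw x/z positions over time rather than binned gaze hits, would give the navigation scatterplot described above.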

Spatial tracking respects Apple Vision Pro data privacy  

It may seem counterintuitive, but XR data analytics doesn’t need intrusive data collection to understand user behaviour and measure AR/VR performance. Our end-to-end spatial data analytics platform was specifically built with Internet of Behaviour (IoB) technology to limit data access to 6DOF headset and handset movements in 3D space. The Apple Vision Pro takes an equally admirable approach to data privacy: there is no data feed from the outward-facing cameras, and eye tracking data is encrypted and never shared with third-party apps. Spatial tracking for spatial computing therefore makes a lot of sense, especially when attentional data similar to eye tracking can be derived without invading people’s biometric privacy. Another benefit of spatial tracking is the omnipresence of this data set across all AR/VR devices: it’s platform agnostic, scalable across all AR/VR projects and standardises the metrics of human behaviour in XR.
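As an illustration of how little a 6DOF-only approach actually collects, a single tracking sample can be nothing more than a timestamp, a position and an orientation. The record below is a hypothetical shape for such a sample, not the platform’s actual schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PoseSample:
    """One 6DOF sample: no camera frames, no eye images, no biometrics."""
    timestamp: float                               # seconds since session start
    position: tuple[float, float, float]           # head/handset position in metres
    rotation: tuple[float, float, float, float]    # orientation quaternion (x, y, z, w)

sample = PoseSample(0.016, (0.0, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0))
print(sample.position)  # (0.0, 1.6, 0.0)
```

Three floats of position and four of rotation per sample is the entire payload, which is what makes the data set cheap to store and, crucially, free of personally identifying imagery.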

One more thing…how does Apple Vision Pro make you feel?

The whirlwind of commentary about the Apple Vision Pro has sparked plenty of debate within the XR community, but all that matters is the human experience. First-hand accounts from non-technical people about how it makes them feel trump tech-spec comparisons. A good example is the debate about hand controllers which, let’s face it, are the XR equivalent of a computer mouse. The future of spatial computing is natural human interfaces (e.g. hands, not hand controllers), and we think Apple’s entry into the XR headset market moves the industry closer to realising that potential.

Spatial tracking demo