XR teenager needs Web3 data intelligence to grow


Extended Reality (XR) is currently in its teenage phase: growing quickly, full of ideas and ready to take on the world, but perhaps lacking the wisdom and self-knowledge to understand itself properly. It’s a path well-trodden by previous generations: TV industry measurement lagged decades behind audience growth, and Google Analytics didn’t launch until 2005. Our XR teenager is growing into a persistent, interoperable and synchronous adult (the Metaverse) and needs spatial data intelligence to be the best it can be.

The computing platform shift from desktop and mobile screen interfaces (Web2) to more natural AR/VR human experiences (Web3 and spatial data) is a drum many of us have been banging for years. Physical and digital worlds are starting to blur, from holographic surgical training to 3D digital twins of entire cities, so it’s increasingly important to understand human behaviour in XR and close the knowledge gap with Web3 data analytics built for the future rather than the past.

Human behaviour in 3D spatial experiences is fundamentally different from 2D screen interactions because it creates a sense of presence unique to XR. Having six degrees of freedom (6DOF) to move naturally through and around AR/VR content isn’t the same as mouse clicks or touchscreen taps, so there’s rich insight to be had if we look beyond legacy Web2 data analytics. What do users look at? How do they navigate the experience? Our teenager is inquisitive and progressive, so dwell time and click-through rates feel a bit 2015.

From the oldest medium (posters) to the newest (digital), analytics tools have evolved to measure the unique nature of each medium, and standardised industry metrics have accelerated growth. Imagine having no web analytics or social monitoring tools to measure Web2 performance. Buzz monitoring (as some of us will remember) was like the Wild West, with everyone scrambling to understand Facebook and Twitter as these platforms went stratospheric.

AR/VR is no different, so it’s time to recognise spatial data analytics as a critical element of growth and a rite of passage to maturity. Tapping into sensor data from handsets and headsets yields billions of data points on head, hand and body movements in 3D space. Interpreting raw sensor data has been a cognitive science challenge; however, five years of R&D on live projects has proven its accuracy, helped standardise metrics for attention and navigation, and even quantified what on earth immersion means!
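To make the idea of turning raw pose data into attention metrics concrete, here is a minimal sketch of one way a gaze-dwell metric could be derived from 6DOF head-pose samples. Everything here (the sample format, the object positions, the 15° gaze-cone threshold and the function name) is a hypothetical illustration, not CORTEXR’s actual pipeline:

```python
import math

# Hypothetical 6DOF samples: (timestamp in seconds, head position xyz,
# forward unit vector xyz). Real headsets emit these at 60-120 Hz.
SAMPLES = [
    (0.0, (0, 1.6, 0), (0, 0, 1)),
    (0.1, (0, 1.6, 0), (0, 0, 1)),
    (0.2, (0, 1.6, 0), (0.9701, 0, 0.2425)),  # head turned toward the poster
    (0.3, (0, 1.6, 0), (0.9701, 0, 0.2425)),
]

# Illustrative points of interest in the scene, in world coordinates.
OBJECTS = {"statue": (0, 1.6, 5), "poster": (4, 1.6, 1)}
GAZE_CONE_DEG = 15  # assume anything within this angle of the forward ray is "looked at"

def dwell_times(samples, objects, cone_deg=GAZE_CONE_DEG):
    """Accumulate per-object gaze dwell time (seconds) from head-pose samples."""
    dwell = {name: 0.0 for name in objects}
    # Attribute each inter-sample interval to whatever the head was facing at its start.
    for (t0, pos, fwd), (t1, _, _) in zip(samples, samples[1:]):
        dt = t1 - t0
        for name, obj in objects.items():
            # Normalised vector from the head to the object.
            v = tuple(o - p for o, p in zip(obj, pos))
            mag = math.sqrt(sum(c * c for c in v))
            cos_angle = sum(a * b for a, b in zip(fwd, v)) / mag
            angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
            if angle <= cone_deg:
                dwell[name] += dt
    return dwell
```

With the samples above, the statue accumulates dwell time while the head faces forward and the poster accumulates it after the head turn. A production system would of course work with eye tracking where available, smooth out tracker jitter, and handle occlusion, but the core idea of reducing billions of pose samples to a small set of standardised attention metrics is the same.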

Google Analytics and social listening companies like Brandwatch helped standardise the measurement of Web2 so brands, businesses and investors could attribute ROI. CORTEXR does the same for Web3 data, with standardised metrics and data visualisations for AR and VR irrespective of platform and device, so you’re not limited by closed systems. Think of it as ‘social listening for XR and the Metaverse’, built on data for the Spatial Web.

Our teenager is approaching adulthood so our mission is to advance the metrics of human behaviour in XR and the Metaverse. It’s what the cool kids are doing.
